Conference Paper

A biped robot that keeps steps in time with musical beats while listening to music with its own ears

Kyoto Univ., Kyoto
DOI: 10.1109/IROS.2007.4399244 Conference: 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2007)
Source: IEEE Xplore


We aim to enable a biped robot to interact with humans through real-world music in daily-life environments, e.g., to autonomously keep its steps (stamps) in time with musical beats. To achieve this, the robot should be able to robustly predict beat times in real time while listening to a musical performance with its own ears (head-embedded microphones). Most previous studies on music-synchronized robots have not addressed this, due to the difficulty of predicting beat times in real-world music. To solve this problem, we implemented a beat-tracking method developed in the field of music information processing. The predicted beat times are then used by a feedback-control method that adjusts the robot's step intervals to synchronize its steps with the beats. Experimental results show that the robot can adjust its steps to the beat times as the tempo changes, needing about 25 s after a tempo change to recognize the new tempo and resynchronize its steps.



Available from: Hiroshi G. Okuno
  • Source
    • "and [49], and a feedback-control method was used to regulate the step intervals of the robot to keep its steps in time with musical beats [48] (see Fig. 8) that were predicted in a real-world environment by implementing a beat-tracking method based on [76]. "
    ABSTRACT: Robotic dance is an important topic in the field of social robotics. Its research has a vital significance to both humans and robotics. This paper presents a review of the state of the art in robotic dance. Robotic dance is classified into four categories: cooperative human-robot dance, imitation of human dance motions, synchronization for music, and creation of robotic choreography. The research methods in each category are discussed. Future research areas are highlighted.
    IEEE Transactions on Human-Machine Systems 06/2015; 45(3):1-13. DOI:10.1109/THMS.2015.2393558 · 1.98 Impact Factor
  • Source
    • "Roger et al. developed a robotic bagpipe player [3]. Kazuyoshi et al. developed an algorithm that a humanoid robot can following musical beat [4]. Gil and Scott developed an interactive robotic percussionist [5]. "
    ABSTRACT: One interesting field of robotics technology relates to the entertainment industry. Performing a musical piece with a robot is a difficult task because music has many features, such as melody, rhythm, tone, and harmony, and addressing them with a robot is not trivial. Most approaches in this field lack the quality needed to perform in front of a human audience, and human-like motions cannot be properly achieved with a conventional robot actuator. Consequently, we exploit a new type of actuator that mitigates the drawbacks of a conventional one: a Variable Stiffness Actuator (VSA), which lets us control position, force, and stiffness simultaneously. Its most important novel feature is its controllable stiffness; when the stiffness of the actuator changes, the characteristics of the actuator's response also change. Using a variable stiffness actuator, we implemented the specific stroke called the "double stroke". Although the double stroke has been regarded as a stroke that only humans can perform, it was successfully implemented through stiffness variation.
    2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014); 09/2014
  • Source
    • "Mizumoto et al. [10] developed a robot musician that mutually interacts with a human musician using musical instruments, such as a flute. Yoshii et al. and Murata et al. [11], [12] have described a humanoid robot that can sing and step to musical tempos using robot audition. Their work focuses on catching the target sound in a noisy environment including ego-noise, and differs from our goal on that point. "
    ABSTRACT: This paper presents the development toward a dancing robot that can listen to and dance along with musical performances. One of the key components of this robot is the ability to modify its dance motions with varying tempos, without exceeding motor limitations, in the same way that human dancers modify their motions. In this paper, we first observe human performances of the same musical piece at varying musical tempos, and then analyze human modification strategies. The analysis is conducted in terms of three body components: lower, middle, and upper bodies. We assume that these body components have different purposes and different modification strategies, respectively, for the performance of a dance. For all of the motions of these three components, we have found that certain fixed postures, which we call keyposes, tend to be preserved. Thus, this paper presents a method to create motions for robots at a certain music tempo, from human motion at an original music tempo, by using these keyposes. We have implemented these algorithms as an automatic process and validated their effectiveness using the physical humanoid robot HRP-2. This robot succeeded in performing the Aizu-bandaisan dance, one of the Japanese traditional folk dances, 1.2 and 1.5 times faster than the tempo originally learned, while maintaining its physical constraints. Although we have not yet achieved a dancing robot that autonomously interacts with varying music tempos, we believe our method plays a vital role in the dancing-to-music capability.
    IEEE Transactions on Robotics 06/2014; 30(3):771-778. DOI:10.1109/TRO.2014.2300212 · 2.43 Impact Factor