Emi Wada’s research while affiliated with Ryukoku University and other places


Publications (1)


Emotion control system for MIDI excerpts: MOR2ART
  • Conference Paper

September 2010 · 50 Reads · 6 Citations

Noritaka Moriguchi · Emi Wada · Masanobu Miura

Emotional expression when performing music (singing or playing a musical instrument) requires skill, and such skill is generally difficult to learn. Computer systems that make it easy for non-musicians to express emotion have been proposed [1]. These systems can express five or six emotions during a musical performance, but they cannot control the degree of an emotion, such as mild versus intense anger. Users, non-musicians as well as musicians, need to manipulate emotions continuously, with immediate results for the audience. We therefore propose a system for controlling degrees of emotion in MIDI files. Our proposed system, Mood Operator Realized as an Application of Affective Rendering Techniques (MOR2ART), is designed to control the emotion expressed during a musical performance using excerpts in the standard MIDI file (SMF) format. In musical performances, an emotion is expressed through the use of several performance profiles [2]. An emotion plane, defined in a previous study, is used in our system: the user manipulates a pointer in that plane to continuously change several performance profiles of a given excerpt, such as timbre, tempo, number of performance tracks, and loudness. Users can thus easily control the emotional expression of an excerpt. The emotion is conveyed in the music when it is played back, and listeners can easily identify the expressed emotion from this playback. In an experimental evaluation, we confirmed that MOR2ART enables a non-musician to express emotion through his or her performance.
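The emotion-plane control described in the abstract can be sketched roughly as follows. The axis semantics (valence/arousal), parameter ranges, and linear mapping below are illustrative assumptions for a minimal sketch, not the mapping actually used by MOR2ART:

```python
# Hypothetical sketch of an emotion-plane controller in the spirit of
# MOR2ART: a 2-D pointer position is mapped to performance profiles
# (tempo, loudness, number of active tracks). All axis meanings and
# numeric ranges are assumptions, not taken from the paper.

def lerp(lo, hi, t):
    """Linear interpolation between lo and hi for t in [0, 1]."""
    return lo + (hi - lo) * t

def emotion_to_profiles(x, y, total_tracks=8):
    """Map a pointer on the emotion plane to performance parameters.

    x: valence in [-1, 1] (negative = sad, positive = happy) -- assumed axis
    y: arousal in [-1, 1] (negative = calm, positive = excited) -- assumed axis
    """
    ax = (x + 1) / 2  # normalize valence to [0, 1]
    ay = (y + 1) / 2  # normalize arousal to [0, 1]
    return {
        "tempo_bpm": round(lerp(60, 180, ay)),   # higher arousal, faster tempo
        "velocity": round(lerp(40, 120, ay)),    # MIDI note velocity, 0-127
        "tracks": max(1, round(lerp(1, total_tracks, (ax + ay) / 2))),
        "mode": "major" if x >= 0 else "minor",  # positive valence picks major
    }

# Pointer in the "happy/excited" quadrant of the assumed plane.
profiles = emotion_to_profiles(0.5, 0.8)
```

Dragging the pointer smoothly through the plane would re-evaluate this mapping continuously, which is the interaction the abstract describes for non-musicians.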

Citations (1)


... Another study revealed the common feature of drum rhythm patterns [4], and several studies on musical arrangement have been conducted [5][6][7]. Other studies have been reported on the handling of a large volume of musical data [8,9], and by using such data, an automatic performance system for controlling musical emotion was developed [10]. However, no databases have been reported for analyzing "onset," "interval," or "dynamics" profiles for the bass guitar parts because no methods have been developed for identifying the Musical Instrument Digital Interface (MIDI) track of the bass guitar from MIDI excerpts. ...
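A naive first attempt at the track-identification problem the excerpt describes might filter tracks by General MIDI program number (the GM bass family occupies programs 32-39, 0-indexed). The simplified track representation below is an assumption for illustration, not a real SMF parser, and such a heuristic fails when files lack reliable program assignments, which is part of why the excerpt treats this as an open problem:

```python
# Hypothetical heuristic for spotting a bass-guitar track, assuming each
# track has already been reduced to the program number from its Program
# Change event. General MIDI programs 32-39 (0-indexed) are the bass
# family: Acoustic Bass through Synth Bass 2.

GM_BASS_PROGRAMS = range(32, 40)

def find_bass_tracks(tracks):
    """tracks: dict mapping track name -> GM program number (0-indexed)."""
    return [name for name, program in tracks.items()
            if program in GM_BASS_PROGRAMS]

# Toy excerpt: program 33 is Electric Bass (finger) in General MIDI.
excerpt = {"melody": 25, "bass": 33, "strings": 48, "piano": 0}
bass_tracks = find_bass_tracks(excerpt)
```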

Reference:

Automatic arrangement for the bass guitar in popular music using principal component analysis
Emotion control system for MIDI excerpts: MOR2ART
  • Citing Conference Paper
  • September 2010