Audio-haptic interaction with modal synthesis models

Abstract

A musical instrument augmented with haptic feedback allows a performer to interact haptically with a modal synthesis-based sound synthesizer. This subject is explored using the resonators object in Synth-A-Modeler. To synthesize a single mode of vibration, the resonators object is configured with a single frequency in Hertz, a single T60 exponential decay time in seconds, and a single equivalent mass in kilograms. Changing the mass changes not only the level of the output sound but also how the mode feels when touched via haptic feedback. The resonators object can also be configured to represent a driving-point admittance corresponding to arbitrarily many modes of vibration, in which case each mode is specified by its own frequency, decay time, and equivalent mass. Because the modal parameters can be determined using an automated procedure, modal models can be calibrated, within limits, from recordings of sounds that decay approximately exponentially. Model structures incorporating the resonators object are presented in a variety of contexts, and their musical application is demonstrated through compositions that use them.
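
The per-mode parameterization described in the abstract (frequency in Hz, T60 decay time in seconds, equivalent mass in kg) can be illustrated with a short offline sketch. The code below is not the Synth-A-Modeler resonators implementation, which runs in real time and exchanges signals with a haptic device; it is a minimal rendering, under the usual lightly damped mass-spring-damper assumption, in which each mode is an exponentially decaying sinusoid whose level scales inversely with its equivalent mass.

```python
import numpy as np

def modal_impulse_response(modes, fs=44100, dur=2.0):
    """Sum of decaying sinusoids, one per (freq_hz, t60_s, mass_kg) triple.

    Each mode is treated as a lightly damped mass-spring-damper; its displacement
    response to a unit force impulse is (1 / (m * w)) * exp(-sigma * t) * sin(w * t),
    so a larger equivalent mass lowers that mode's contribution to the output.
    """
    t = np.arange(int(fs * dur)) / fs
    out = np.zeros_like(t)
    for freq_hz, t60_s, mass_kg in modes:
        w = 2.0 * np.pi * freq_hz          # damped oscillation frequency (rad/s)
        sigma = np.log(1000.0) / t60_s     # decay rate giving -60 dB after t60_s
        out += (1.0 / (mass_kg * w)) * np.exp(-sigma * t) * np.sin(w * t)
    return out

# A single mode: 440 Hz, 1.5 s T60, 10 g equivalent mass.
one_mode = modal_impulse_response([(440.0, 1.5, 0.01)])

# A multi-mode model: every mode carries its own frequency, decay time, and mass.
multi_mode = modal_impulse_response([
    (220.0, 2.0, 0.010),
    (606.0, 1.2, 0.015),
    (1181.0, 0.8, 0.020),
])
```

Doubling a mode's equivalent mass halves its output level in this sketch, which matches the abstract's observation that the mass parameter controls level; the corresponding change in haptic feel requires the real-time, force-interactive implementation and is not reproduced here.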
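
The abstract also refers to an automated procedure for determining modal parameters from recordings that decay approximately exponentially. The sketch below is one plausible approach assumed for illustration, not necessarily the procedure used in the paper: spectral peak picking for the frequencies, a straight-line fit to the decibel envelope of each band-passed partial for the T60 values, and an amplitude-based mass estimate that is only meaningful up to a global scale factor. The bandwidth and peak-count parameters are illustrative choices.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks, hilbert

def estimate_modes(recording, fs, num_modes=5, bandwidth_hz=40.0):
    """Rough modal-parameter estimates (freq_hz, t60_s, mass) from a decaying recording."""
    spectrum = np.abs(np.fft.rfft(recording))
    freqs = np.fft.rfftfreq(len(recording), 1.0 / fs)
    df = freqs[1] - freqs[0]
    peaks, _ = find_peaks(spectrum, distance=max(1, int(bandwidth_hz / df)))
    peaks = peaks[np.argsort(spectrum[peaks])[::-1][:num_modes]]   # strongest peaks

    t = np.arange(len(recording)) / fs
    modes = []
    for p in sorted(peaks):
        f0 = freqs[p]
        lo, hi = max(1.0, f0 - bandwidth_hz), min(fs / 2 - 1.0, f0 + bandwidth_hz)
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, recording)
        env = np.abs(hilbert(band))                    # amplitude envelope of this partial
        start = int(np.argmax(env))                    # fit the decay from the envelope peak
        env_db = 20.0 * np.log10(env + 1e-12)
        slope = np.polyfit(t[start:], env_db[start:], 1)[0]   # dB per second, negative for decay
        t60 = -60.0 / slope if slope < 0 else float("inf")
        # Mass estimate matching the 1/(m*w) scaling of the synthesis sketch above;
        # absolute calibration would need a known excitation force, so this is relative only.
        mass = 1.0 / (env[start] * 2.0 * np.pi * f0)
        modes.append((f0, t60, mass))
    return modes
```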
