
A Multi-Instrument, Force-Feedback Keyboard

Oboe, Roberto
2006

Abstract

When playing a musical instrument, a player perceives not only the sound generated, but also the haptic interaction arising during the contact between player and instrument. Such haptic interaction, based on the sense of touch, involves several senses in the player: tactile, kinesthetic (i.e., mediated by end organs located in muscles, tendons, and joints and stimulated by bodily movements and tensions), proprioceptive (i.e., relating to stimuli arising within the organism), etc. By its nature, the haptic interaction is bidirectional, and this is exploited by musical instrument players, who can better correlate their actions on the instrument to the sound generated. For instance, by paying attention to the interaction force between key and finger arising during the descent of the key, pianists can detect the re-triggering of the escapement mechanism and, in turn, can adjust the key motion to obtain the fastest repetition of the note.

Roughly speaking, haptic information allows the player to perceive the “state” of the mechanism being manipulated through the key. By using this knowledge about the state of the mechanism and correlating it with the sound generated, the player learns a strategy to obtain the desired tones. This tight correspondence between acoustic response and touch response, however, is lost in many electronic instruments (e.g., in standard commercial synthesizers), in which sound generation is related only to the key attack velocity and pressure. In this type of synthetic instrument, the touch feedback is independent of the instrument being simulated. For instance, the interaction with different instruments such as harpsichord, piano, or pipe organ gives the same haptic information to the player. This constitutes a significant limitation for the musician, who loses expressive control of the instrument and, in turn, of the generated sound.

This consideration has led to several research activities aimed at the realization of an active keyboard, in which actuators connected to the keys are driven in such a way that the haptic interaction experienced is the same as if the player were interacting with the keyboard of the real instrument being emulated by the synthesizer (Baker 1988; Cadoz, Lisowski, and Florens 1990; Gillespie 1992; Gillespie and Cutkosky 1992; Cadoz, Luciani, and Florens 1993; Gillespie 1994). Such haptic displays are usually referred to as “virtual mechanisms,” because they are designed to reproduce the touch feedback that a user would experience when interacting with an actual multi-body mechanism. This concept extends to multi-body mechanisms composed of several parts that interact with one another through impacts, constraints, etc. In such a case, the motion of each part of the virtual mechanism must be calculated by a dynamic simulator, which incorporates all the characteristics of the real mechanism and computes the interaction forces among the parts. It is worth noting that, at times, an overly detailed description of the real mechanism leads to a bulky dynamic simulator, unsuitable for the real-time implementation required in haptic interaction. Moreover, it is usually difficult to tune the parameters of the dynamic simulator, especially when the mechanism to be simulated contains several nonlinear components, such as nonlinear dampers or constraints.
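To make the idea of a virtual mechanism driven by a dynamic simulator more concrete, the following Python sketch shows a generic real-time haptic loop: the measured key position is fed to a simulator, which returns the force the key actuator should render. It is purely illustrative; the spring-damper mechanism, the sensor and actuator stubs, the 1 kHz update rate, and all numerical values are assumptions made for the example, not the setup used in the MIKEY project.

```python
import time

DT = 0.001  # haptic update period (s); 1 kHz is assumed here for illustration

class VirtualMechanism:
    """Toy single-degree-of-freedom mechanism: a spring-damper on the key."""
    def __init__(self, stiffness=200.0, damping=1.5):
        self.k = stiffness   # N/m, illustrative value
        self.b = damping     # N*s/m, illustrative value
        self.prev_x = 0.0

    def step(self, key_pos, dt):
        # Estimate key velocity by finite differences and compute the
        # reaction force the player should feel (spring plus damper).
        vel = (key_pos - self.prev_x) / dt
        self.prev_x = key_pos
        return -(self.k * key_pos + self.b * vel)

class FakeSensor:
    """Stand-in for a key-position sensor (returns a fixed displacement)."""
    def read(self):
        return 0.005  # 5 mm key depression

class FakeActuator:
    """Stand-in for the key actuator (just records the commanded force)."""
    def apply(self, force):
        self.last_force = force

def haptic_loop(sensor, actuator, mechanism, steps=10):
    """One possible structure for the read-simulate-actuate cycle."""
    for _ in range(steps):
        x = sensor.read()          # measured key displacement (m)
        f = mechanism.step(x, DT)  # force computed by the dynamic simulator
        actuator.apply(f)          # command the key actuator (N)
        time.sleep(DT)             # placeholder for a hard real-time timer

if __name__ == "__main__":
    haptic_loop(FakeSensor(), FakeActuator(), VirtualMechanism())
```

In a real device, the sleep-based timing would be replaced by a hard real-time scheduler, and the mechanism model would be as rich as the real-time budget allows, which is exactly the tradeoff discussed above.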
Among all the keyboard-operated instruments, the grand piano has by far the most complicated mechanism (Topper and Wills 1987). The grand piano action, in fact, is composed of dozens of components, and this, as mentioned, has impeded the realization of a real-time dynamic simulator for it. A remarkable work by Gillespie and Cutkosky (1992) shows how it is possible to implement a very detailed model of the piano action and tune it by matching simulation and experimental results, the latter obtained by accurately measuring all dynamic and kinematic variables on an actual piano mechanism. The resulting model, however, even though it agrees well with the experimental data, can run only offline. Given these considerations, several researchers have focused their work on reproducing only one or a few specific behaviors of the mechanism. For instance, Baker (1988) proposes the simulation of user-programmable inertial and viscous characteristics to adapt the keyboard to the player’s taste. Gillespie (1992, 1994), on the other hand, has studied the modeling of a simplified piano action composed of only two bodies: the key and the hammer. Even with this very simple model, it is possible to reproduce part of the hammer motion, which comprises three different phases: contact with the key, free flight, and return onto the key. This model, however, does not take into account the impact of the hammer with the string or the effect of escapement, even though such characteristics are very useful in regaining the previously mentioned correspondence between acoustic response and haptic interaction.

This article presents the preliminary results obtained by the MIKEY (Multi-Instrument active KEYboard) project, which aims at the realization of a multi-instrument active keyboard with realistic touch feedback. In particular, the instruments to be emulated are the grand piano, the harpsichord, and the Hammond organ. Given the previous considerations, it is clear that some tradeoff between model accuracy and real-time operability had to be made at the beginning of the project, especially for the grand piano. The research presented here started from the work of Gillespie and improves upon it by adding some additional features, namely the hammer-string impact, various state-dependent hammer-key impacts, and the escapement effect. Also, to improve the quality of the haptic feedback, a direct-drive, low-friction motor has been used. Finally, particular attention has been paid to the cost of the overall system, by using inexpensive devices for sensing, actuation, and real-time computation. After introducing the models used in the dynamic simulator, the article describes the experimental setup realized. The experimental results obtained are then reported and compared with those obtained with a standard piano keyboard. Comments on the results presented conclude the article.
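As a rough illustration of the kind of simplified key-hammer model discussed above (two bodies, with contact, flight, and return phases, extended with a string impact and an escapement event), the sketch below implements a toy state machine in Python. All parameters and force values (lever ratio, escapement point, string position, restitution, placeholder contact and impact forces) are invented for the example and are not values from the article.

```python
from enum import Enum, auto

class HammerPhase(Enum):
    CONTACT = auto()   # hammer pushed by the key
    FLIGHT = auto()    # hammer travelling freely toward the string
    RETURN = auto()    # hammer falling back onto the key after the string hit

G = 9.81                # gravity (m/s^2)
LEVER_RATIO = 5.0       # key-to-hammer transmission ratio (assumed)
ESCAPEMENT_POS = 0.008  # key depth at which the hammer is released (m, assumed)
STRING_POS = 0.052      # hammer travel at which it strikes the string (m, assumed)
RESTITUTION = 0.5       # velocity ratio after the string impact (assumed)

class TwoBodyAction:
    """Toy two-body (key-hammer) action with escapement and string impact."""
    def __init__(self):
        self.phase = HammerPhase.CONTACT
        self.h_pos = 0.0   # hammer position (m)
        self.h_vel = 0.0   # hammer velocity (m/s)

    def step(self, key_pos, key_vel, dt):
        """Advance the hammer state and return the force felt at the key."""
        if self.phase == HammerPhase.CONTACT:
            # Hammer kinematically driven by the key through the action lever.
            self.h_pos = LEVER_RATIO * key_pos
            self.h_vel = LEVER_RATIO * key_vel
            if key_pos > ESCAPEMENT_POS and key_vel > 0.0:
                self.phase = HammerPhase.FLIGHT   # let-off: hammer released
            return 0.5  # placeholder contact force (N), not a measured value

        if self.phase == HammerPhase.FLIGHT:
            # Free flight toward the string, decelerated by gravity.
            self.h_vel -= G * dt
            self.h_pos += self.h_vel * dt
            if self.h_pos >= STRING_POS:
                self.h_vel = -RESTITUTION * self.h_vel  # lossy string impact
                self.phase = HammerPhase.RETURN
            elif self.h_vel <= 0.0:
                self.phase = HammerPhase.RETURN   # blow too weak: no string hit
            return 0.0  # no force fed back to the key while the hammer flies

        # RETURN: hammer falls until it lands on the (possibly moving) key.
        self.h_vel -= G * dt
        self.h_pos += self.h_vel * dt
        if self.h_pos <= LEVER_RATIO * key_pos:
            self.phase = HammerPhase.CONTACT
            return 2.0  # placeholder hammer-key landing impulse (N)
        return 0.0

if __name__ == "__main__":
    action = TwoBodyAction()
    dt, key_pos = 0.001, 0.0
    for i in range(300):
        key_vel = 0.2 if i < 50 else 0.0          # fast key press, then hold
        key_pos = min(key_pos + key_vel * dt, 0.010)
        action.step(key_pos, key_vel, dt)
    print("final hammer phase:", action.phase)
```

In a real haptic display, the per-phase force returned to the key would come from the contact dynamics rather than from fixed placeholders, but the phase structure sketched here mirrors the contact, flight, and return behavior described above.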
Use this identifier to cite or link to this document: https://hdl.handle.net/11577/127294