ABSTRACT
Emotional expression in musical performance (singing or playing an instrument) requires skill, and such skill is generally difficult to learn. Computer systems that make it easy for non-musicians to express a given emotion have been proposed [1]. These systems can express five or six discrete emotions during a performance, but they cannot control the degree of an emotion, such as the difference between fierce and calm anger. Users, non-musicians as well as musicians, need to manipulate emotions continuously, with immediately audible results for the audience. We therefore propose a system for controlling the degree of expressed emotion in MIDI files. We call our system the Mood Operator Realized as an Application of Affective Rendering Techniques (MOR2ART); it controls the emotion expressed during a musical performance of excerpts in the standard MIDI file (SMF) format. In musical performance, an emotion is expressed through several performance profiles [2]. Our system uses an emotion plane, defined in a previous study, in which the user manipulates a pointer to continuously vary the performance profiles of a given excerpt, such as timbre, tempo, number of performance tracks, and loudness. Users can thus easily control the emotional expression of an excerpt, and listeners can readily identify the expressed emotion when the excerpt is played back. In an experimental evaluation, we confirmed that MOR2ART enables a non-musician to express emotion through his/her performance.
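The core idea, a pointer position on a 2D emotion plane driving several performance profiles at once, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the axis names (valence, arousal), parameter ranges, and the linear mappings are all assumptions.

```python
# Illustrative sketch (assumed mapping, not MOR2ART's actual one):
# a pointer in [-1, 1] x [-1, 1] on an emotion plane is mapped to
# MIDI-level performance profiles: tempo, loudness, timbre, track count.

def plane_to_profile(valence: float, arousal: float) -> dict:
    """Map a pointer position on the emotion plane to performance parameters."""
    if not (-1.0 <= valence <= 1.0 and -1.0 <= arousal <= 1.0):
        raise ValueError("pointer must lie within the emotion plane")
    # Higher arousal -> faster tempo and louder dynamics.
    tempo_bpm = 100 + 40 * arousal             # 60..140 BPM
    velocity = int(round(80 + 40 * arousal))   # MIDI note velocity 40..120
    # Higher valence -> brighter timbre and more active tracks (assumed).
    program = 0 if valence >= 0 else 48        # e.g. piano vs. string ensemble
    tracks = 2 + round(1.5 * (valence + 1))    # 2..5 performance tracks
    return {"tempo_bpm": tempo_bpm, "velocity": velocity,
            "program": program, "tracks": tracks}

# Dragging the pointer continuously re-evaluates this mapping, so the
# playback parameters change smoothly with the user's gesture.
profile = plane_to_profile(0.5, 0.8)
```

Because each profile is a continuous function of the pointer position, intermediate degrees of an emotion (e.g. mildly vs. intensely aroused) fall out of the mapping rather than being limited to a fixed set of five or six presets.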
- R. Bresin, A. Friberg, "Emotional Coloring of Computer-Controlled Music Performances", Computer Music Journal, Vol. 24, No. 4, pp. 44--63 (2000).
- P. N. Juslin, J. A. Sloboda, "Music and Emotion", Oxford University Press (2001).
- T. Yamasaki, "Emotional communication through music performance played by young children", Proc. of ICMPC8, pp. 204--206 (2004).
- R. Plutchik, "The Psychology and Biology of Emotion", New York, HarperCollins (1994).
Index Terms
Emotion control system for MIDI excerpts: MOR2ART