ABSTRACT
In many Western performing art traditions, audiences have been ascribed the position of receivers who bear no creative role. This perception shifts radically in contemporary participatory art, which fosters greater creative involvement and appropriation by audiences. In this paper, we augment the music listening experience through sound-reactive visuals that vary according to listeners' arousal response. Our approach seeks to involve audiences in the live generation of visuals accompanying music, giving participants an indirect, yet creative, role. We present an implementation of our concept, Mood Visualiser, a web application that receives data from a wireless bio-sensing platform used to characterise listeners' arousal response at the physiological level. Mood Visualiser provides an artistic visual representation of the music and the listening experience derived from both the audio and physiological domains. We describe and evaluate a use case that focuses on a listener's electro-dermal activity (EDA), a physiological correlate of arousal (or excitement). Feedback received in user surveys was positive overall, and we identified further design challenges around the visual expression of emotions perceived in music and the suitability of sensor interfaces during music listening.
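The abstract describes driving visuals from a listener's EDA signal. As an illustrative sketch only (not the authors' implementation), the pipeline could be: smooth the raw sensor stream to attenuate noise, normalise it to a [0, 1] arousal proxy, and map that value to a visual parameter such as colour saturation. The smoothing constant, baseline, and peak values below are assumptions for illustration.

```python
# Illustrative sketch, not the Mood Visualiser implementation: turning a
# stream of electro-dermal activity (EDA) samples into a normalised arousal
# value that could drive a visual parameter (e.g. colour saturation).

def smooth_eda(samples, alpha=0.2):
    """Exponential moving average to attenuate sensor noise.

    alpha is an assumed smoothing constant; real systems would tune it
    to the sensor's sampling rate and noise profile.
    """
    smoothed = []
    ema = samples[0]
    for s in samples:
        ema = alpha * s + (1 - alpha) * ema
        smoothed.append(ema)
    return smoothed

def arousal_level(value, baseline, peak):
    """Min-max normalise an EDA reading to a [0, 1] arousal proxy.

    baseline/peak would come from a per-listener calibration phase.
    """
    if peak == baseline:
        return 0.0
    return max(0.0, min(1.0, (value - baseline) / (peak - baseline)))

# Example: rising skin conductance (in microsiemens) during an exciting passage.
eda = [2.0, 2.1, 2.5, 3.2, 3.1, 3.6]
smoothed = smooth_eda(eda)
saturation = arousal_level(smoothed[-1], baseline=2.0, peak=4.0)
```

In a browser-based system such as the one described, the resulting value would typically be sent to the rendering layer on each animation frame; the per-listener calibration step is a common requirement because EDA baselines vary widely between individuals.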
Mood Visualiser: Augmented Music Visualisation Gauging Audience Arousal