DOI: 10.1145/3123514.3123517

research-article

Mood Visualiser: Augmented Music Visualisation Gauging Audience Arousal

Published: 23 August 2017

ABSTRACT

In many Western performing art traditions, audiences have been cast as receivers without a creative role. This perception is shifting radically in contemporary participatory art, which fosters greater creative involvement and appropriation by audiences. In this paper, we augment the music listening experience through sound-reactive visuals which vary according to listeners' arousal response. Our approach seeks to involve audiences in the live generation of visuals accompanying music, giving participants an indirect, yet creative, role. We present an implementation of our concept, Mood Visualiser, a web application which receives data from a wireless bio-sensing platform used to characterise the arousal response of listeners at the physiological level. Mood Visualiser provides an artistic visual representation of the music and the listening experience derived from both the audio and physiological domains. We describe and evaluate a use case that focuses on a listener's electro-dermal activity (EDA), a correlate of arousal (or excitement). Feedback received in user surveys was overall positive, and we identified further design challenges around the visual expression of emotions perceived in music and the suitability of sensor interfaces during music listening.
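The abstract describes driving live visuals from a listener's electro-dermal activity. The paper's exact signal processing is not reproduced here; the following is a minimal sketch, assuming an exponential moving average to extract the slow (tonic) EDA trend and a linear normalisation of a reading into a 0–1 intensity parameter that a visual renderer could consume. All function names, parameters, and the baseline/ceiling calibration values are hypothetical illustrations, not the authors' implementation.

```python
def smooth_eda(samples, alpha=0.1):
    """Exponential moving average over raw EDA samples (microsiemens).

    Extracts the slow-moving tonic trend, damping sensor noise and
    fast phasic spikes. `alpha` controls responsiveness (0 < alpha <= 1).
    """
    if not samples:
        return []
    level = samples[0]
    smoothed = []
    for s in samples:
        level = alpha * s + (1 - alpha) * level
        smoothed.append(level)
    return smoothed


def arousal_to_intensity(eda_value, baseline, ceiling):
    """Map a smoothed EDA reading to a visual intensity in [0, 1].

    `baseline` and `ceiling` would come from a per-listener calibration
    phase; readings are clamped so the renderer always gets a valid value.
    """
    if ceiling <= baseline:
        return 0.0
    t = (eda_value - baseline) / (ceiling - baseline)
    return max(0.0, min(1.0, t))


# Hypothetical usage: a calm reading near baseline yields a dim visual,
# an excited reading near the ceiling yields a bright one.
trend = smooth_eda([2.1, 2.3, 4.8, 5.2, 5.0])
brightness = arousal_to_intensity(trend[-1], baseline=2.0, ceiling=8.0)
```

In a browser-based system like the one described, the normalised intensity would typically be sent alongside audio features (e.g. spectral data) to the graphics layer each animation frame; the per-listener calibration step matters because resting skin conductance varies widely between individuals.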


Published in

AM '17: Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences
August 2017, 337 pages
ISBN: 9781450353731
DOI: 10.1145/3123514

Copyright © 2017 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers

• research-article
• Research
• Refereed limited

Acceptance Rates

AM '17 Paper Acceptance Rate: 54 of 77 submissions, 70%. Overall Acceptance Rate: 177 of 275 submissions, 64%.
