DOI: 10.1145/2660168.2660183 · research-article

Executable Music Documents

Published: 12 September 2014

ABSTRACT

While good practices are emerging with respect to publication of data alongside research outputs, we argue that computational descriptions (e.g. scripts, software and workflows) should also be included so that research can be interpreted, reconstructed and recomputed. A research article---or Research Object---should then describe all the components associated with a piece of digital research, including the descriptions of code and algorithms, effectively comprising an executable document. Furthermore we observe that such a re-executable object can be re-run automatically. The Music Information Retrieval research community has established community infrastructure and practices which are amenable to this approach, providing a glimpse of a future Music Digital Library. These ideas raise a number of issues for Digital Libraries more generally.
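The abstract's central idea, a research output that bundles its data and computational description so the result can be recomputed and checked automatically, can be sketched in a few lines. The manifest format, file names, and runner below are invented purely for illustration; they are not the paper's infrastructure, nor any Research Object standard.

```python
import hashlib
import subprocess
import sys
import tempfile
from pathlib import Path

# Hypothetical "executable document": a manifest bundling the data, the
# code, the command to run, and a fingerprint of the expected output.
# This format is an assumption made for this sketch, not a real standard.
MANIFEST = {
    "title": "toy analysis",
    "data": {"samples.txt": "3\n1\n2\n"},
    "code": {
        "analysis.py": (
            "values = [int(x) for x in open('samples.txt')]\n"
            "print(sum(values) / len(values))\n"
        )
    },
    "command": [sys.executable, "analysis.py"],
    "expected_output_sha256": None,  # recorded on the first run
}


def execute(manifest: dict) -> str:
    """Materialise the bundle in a fresh directory and re-run it."""
    with tempfile.TemporaryDirectory() as workdir:
        for name, text in {**manifest["data"], **manifest["code"]}.items():
            Path(workdir, name).write_text(text)
        result = subprocess.run(
            manifest["command"],
            cwd=workdir,
            capture_output=True,
            text=True,
            check=True,
        )
        return result.stdout


def is_reproduced(manifest: dict, output: str) -> bool:
    """Compare a re-run's output against the recorded fingerprint."""
    digest = hashlib.sha256(output.encode()).hexdigest()
    if manifest["expected_output_sha256"] is None:
        manifest["expected_output_sha256"] = digest  # first run: record it
        return True
    return digest == manifest["expected_output_sha256"]
```

Because the bundle carries everything needed to recompute the result, `execute(MANIFEST)` can be invoked by a person or by automated library infrastructure, which is the "re-run automatically" property the abstract argues a Music Digital Library could provide.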


Published in:
DLfM '14: Proceedings of the 1st International Workshop on Digital Libraries for Musicology
September 2014, 102 pages
ISBN: 9781450330022
DOI: 10.1145/2660168

          Copyright © 2014 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers: research-article · refereed limited

Overall acceptance rate: 27 of 48 submissions, 56%
