
Comparing complexity of API designs: an exploratory experiment on DSL-based framework integration

Published: 22 October 2011

Abstract

Embedded, textual DSLs are often provided as an API wrapped around object-oriented application frameworks to ease framework integration. While the literature claims that DSL-based application development is beneficial, empirical evidence for this is rare. We present the results of an experiment comparing the complexity of three object-oriented framework APIs and an embedded, textual DSL. For this comparative experiment, we implemented the same non-trivial application scenario using each of these four APIs. We then performed an Object-Points (OP) analysis, yielding complexity indicators specific to each API variant. The main observation from our experiment is that the embedded, textual DSL incurs the smallest API complexity. Although the results are exploratory, and limited to the given application scenario and a single embedded DSL, our findings can direct future empirical work, and the experiment design is applicable to similar API design evaluations.
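An Object-Points analysis, as referenced in the abstract, boils down to a weighted count of the design elements a client of the API must handle. The following is a minimal illustrative sketch only: the element categories, weights, and per-variant counts below are invented assumptions for exposition, not the paper's actual calibration (which builds on the OP literature it cites, e.g. Banker et al. and Sneed).

```python
# Hypothetical sketch of an Object-Points (OP) tally for comparing API variants.
# All weights and counts are illustrative assumptions, not the paper's data.
from dataclasses import dataclass

@dataclass
class ApiCounts:
    classes: int     # classes the client code must touch
    methods: int     # methods/operations the client invokes
    attributes: int  # attributes/parameters the client configures
    relations: int   # inter-object relationships the client must know about

# Assumed weights: classes and methods are taken to dominate complexity.
WEIGHTS = {"classes": 4, "methods": 3, "attributes": 1, "relations": 2}

def object_points(c: ApiCounts) -> int:
    """Weighted sum over the counted design elements (higher = more complex)."""
    return (WEIGHTS["classes"] * c.classes
            + WEIGHTS["methods"] * c.methods
            + WEIGHTS["attributes"] * c.attributes
            + WEIGHTS["relations"] * c.relations)

# Hypothetical counts for the same scenario implemented against two variants:
framework_api = ApiCounts(classes=12, methods=40, attributes=25, relations=18)
dsl_api       = ApiCounts(classes=3,  methods=9,  attributes=14, relations=4)

print(object_points(framework_api))  # -> 229
print(object_points(dsl_api))        # -> 61
```

Under these assumed figures the DSL variant scores far lower, mirroring the paper's qualitative finding; the comparison only becomes meaningful once counts are taken from real implementations of the same scenario, as the experiment does.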

References

  1. R. K. Bandi, V. K. Vaishnavi, and D. E. Turk. Predicting Maintenance Performance Using Object-Oriented Design Complexity Metrics. IEEE Transactions on Software Engineering, 29: 77--87, 2003.
  2. R. D. Banker, R. J. Kauffman, and R. Kumar. An Empirical Test of Object-Based Output Measurement Metrics in a Computer Aided Software Engineering (CASE) Environment. Journal of Management Information Systems, 8(3): 127--150, 1992.
  3. D. Batory, C. Johnson, B. MacDonald, and D. von Heeder. Achieving Extensibility through Product-Lines and Domain-Specific Languages: A Case Study. ACM Trans. Softw. Eng. Methodol., 11(2): 191--214, 2002.
  4. J. Bettin. Measuring the Potential of Domain-Specific Modelling Techniques. In Proceedings of the 2nd Domain-Specific Modelling Languages Workshop (OOPSLA), Seattle, Washington, USA, pages 39--44, 2002.
  5. C. Jones. Applied Software Measurement: Global Analysis of Productivity and Quality. McGraw-Hill, 3rd edition, 2008.
  6. S. Clarke and C. Becker. Using the Cognitive Dimensions Framework to evaluate the usability of a class library. In Proceedings of the 15th Workshop of the Psychology of Programming Interest Group (PPIG 2003), Keele, UK, pages 359--336, 2003.
  7. M. Fowler. Domain Specific Languages. The Addison-Wesley Signature Series. Addison-Wesley Professional, 1st edition, 2010.
  8. P. Gabriel, M. Goulão, and V. Amaral. Do Software Languages Engineers Evaluate their Languages? In Proceedings of the XIII Congreso Iberoamericano en "Software Engineering", 2010.
  9. F. Hermans, M. Pinzger, and A. van Deursen. Domain-Specific Languages in Practice: A User Study on the Success Factors. In Proceedings of the 12th International Conference on Model Driven Engineering Languages and Systems (MODELS 2009), Denver, CO, USA, October 4--9, 2009, volume 5795 of Lecture Notes in Computer Science, pages 423--437. Springer, 2009.
  10. P. Klint, T. van der Storm, and J. Vinju. On the Impact of DSL Tools on the Maintainability of Language Implementations. In C. Brabrand and P.-E. Moreau, editors, Proceedings of the Workshop on Language Descriptions, Tools and Applications 2010 (LDTA'10), pages 10:1--10:9. ACM, 2010.
  11. T. Kosar, P. M. López, P. Barrientos, and M. Mernik. A preliminary study on various implementation approaches of domain-specific languages. Information and Software Technology, 50(5): 390--405, 2008.
  12. J. Merilinna and J. Pärssinen. Comparison Between Different Abstraction Level Programming: Experiment Definition and Initial Results. In Proceedings of the 7th OOPSLA Workshop on Domain-Specific Modeling (DSM'07), Montréal, Canada, number TR-38 in Technical Report, Finland, 2007. University of Jyväskylä.
  13. M. Mernik, J. Heering, and A. Sloane. When and How to Develop Domain-Specific Languages. ACM Computing Surveys, 37(4): 316--344, 2005.
  14. MetaCase. Nokia Case Study. Industry experience report, MetaCase, 2007.
  15. G. Neumann and U. Zdun. XOTcl, an Object-Oriented Scripting Language. In Proceedings of Tcl2k: The 7th USENIX Tcl/Tk Conference, Austin, Texas, USA, 2000.
  16. OASIS. Security Assertion Markup Language (SAML) V2.0 Technical Overview. http://docs.oasis-open.org/security/saml/Post2.0/sstc-saml-tech-overview-2.0-cd-02.pdf, 2008.
  17. H. M. Sneed. Estimating the costs of software maintenance tasks. In Proceedings of the International Conference on Software Maintenance (ICSM'95), Opio (Nice), France, October 17--20, 1995, pages 168--181. IEEE Computer Society, 1995.
  18. M. Strembeck and U. Zdun. An Approach for the Systematic Development of Domain-Specific Languages. Software: Practice and Experience, 39(15): 1253--1292, 2009.
  19. J. Stylos and B. A. Myers. Mapping the Space of API Design Decisions. In 2007 IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC 2007), 23--27 September 2007, Coeur d'Alene, Idaho, USA, pages 50--60. IEEE Computer Society, 2007.
  20. A. van Deursen and P. Klint. Little languages: Little Maintenance? Journal of Software Maintenance, 10(2): 75--92, 1998.
  21. D. A. Wheeler. SLOCCount. http://www.dwheeler.com/sloccount/, last accessed: October 14, 2008.
  22. D. Wile. Lessons learned from real DSL experiments. Science of Computer Programming, 51(3): 265--290, 2004.
  23. J. Wüst. SDMetrics. http://sdmetrics.com/, last accessed: May 27, 2011.
  24. J. Zeng, C. Mitchell, and S. A. Edwards. A Domain-Specific Language for Generating Dataflow Analyzers. Electronic Notes in Theoretical Computer Science, 164(2): 103--119, 2006.


Published in

  ACM SIGPLAN Notices, Volume 47, Issue 3 (GPCE '11)
  March 2012, 179 pages
  ISSN: 0362-1340
  EISSN: 1558-1160
  DOI: 10.1145/2189751

  GPCE '11: Proceedings of the 10th ACM international conference on Generative programming and component engineering
  October 2011, 194 pages
  ISBN: 9781450306898
  DOI: 10.1145/2047862

            Copyright © 2011 ACM


            Publisher

            Association for Computing Machinery

            New York, NY, United States


            Qualifiers

            • research-article
