DOI: 10.1145/3173574.3174106 · CHI Conference Proceedings · Research article · Public Access

Augmenting Code with In Situ Visualizations to Aid Program Understanding

Published: 21 April 2018

ABSTRACT

Programmers must draw explicit connections between their code and runtime state to properly assess the correctness of their programs. However, debugging tools often decouple the program state from the source code and require explicitly invoked views to bridge the rift between program editing and program understanding. To unobtrusively reveal runtime behavior during both normal execution and debugging, we contribute techniques for visualizing program variables directly within the source code. We describe a design space and placement criteria for embedded visualizations. We evaluate our in situ visualizations in an editor for the Vega visualization grammar. Compared to a baseline development environment, novice Vega users improve their overall task grade by about 2 points when using the in situ visualizations and exhibit significant positive effects on their self-reported speed and accuracy.
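To make the idea concrete, the sketch below shows one way a variable's recorded runtime values could be rendered as a word-scale sparkline for inline display next to the variable in the source. This is an illustrative sketch, not the system described in the paper: the sampling of runtime values and the editor hook that would place the widget are assumed, and only the rendering step is shown (in TypeScript).

// Sketch: turn a variable's sampled runtime values into a word-scale SVG
// sparkline that an editor extension could place inline, next to the line
// where the variable is defined. The sampling hook and the editor hook are
// hypothetical; only the rendering is shown here.

interface VariableTrace {
  name: string;      // variable name as it appears in the source
  values: number[];  // runtime samples collected during execution
}

function sparklineSVG(trace: VariableTrace, width = 60, height = 12): string {
  const { values } = trace;
  if (values.length === 0) return "";
  const min = Math.min(...values);
  const max = Math.max(...values);
  const span = max - min || 1; // avoid division by zero for constant values
  // Map each sample to an (x, y) point inside the word-scale box.
  const points = values
    .map((v, i) => {
      const x = (i / Math.max(values.length - 1, 1)) * width;
      const y = height - ((v - min) / span) * height;
      return `${x.toFixed(1)},${y.toFixed(1)}`;
    })
    .join(" ");
  return (
    `<svg width="${width}" height="${height}" role="img">` +
    `<polyline points="${points}" fill="none" stroke="steelblue"/>` +
    `</svg>`
  );
}

// Example: values a runtime might have recorded for a signal named "width"
// while the user resized a chart (hypothetical data).
const widthTrace: VariableTrace = { name: "width", values: [400, 420, 455, 430, 480] };
console.log(sparklineSVG(widthTrace));

An editor extension could attach the returned markup as an end-of-line decoration on the line that defines the variable, keeping the runtime summary co-located with the code it describes.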


Supplemental Material

• pn4254-file3.mp4 (mp4, 20.4 MB)
• pn4254-file5.mp4 (mp4, 3 MB)


Published in

CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
April 2018, 8489 pages
ISBN: 978-1-4503-5620-6
DOI: 10.1145/3173574
Copyright © 2018 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Published: 21 April 2018


Acceptance Rates

CHI '18 paper acceptance rate: 666 of 2,590 submissions, 26%
Overall acceptance rate: 6,199 of 26,314 submissions, 24%

