ABSTRACT
The development of information visualizations for companies poses specific challenges, especially for evaluation. It is advisable to test such visualizations under realistic circumstances, but various constraints can make this quite difficult. In this paper, we discuss three methods that can be used to conduct evaluations in companies. These methods are appropriate for different stages of the software life cycle (design, development, deployment) and reflect an iterative approach to evaluation. Based on an overview of available evaluation methods, we argue that this combination of fairly lightweight methods is especially well suited to evaluating information visualizations in companies. The methods complement each other and emphasize different aspects of the evaluation. Drawing on this case study, we generalize the lessons learned from conducting evaluations in this context.
Index Terms
- Experiences and challenges with evaluation methods in practice: a case study