DOI: 10.1145/1357054.1357287 · CHI Conference Proceedings · Research article

Metrics for measuring human interaction with interactive visualizations for information analysis

Published: 06 April 2008

ABSTRACT

There is a lack of widely accepted metrics for evaluating analysts' experiences with interactive visualizations (IVs) for information analysis. We report an approach for developing analyst-centered IV metrics that is built upon understanding the workplace needs and experiences of information analysts with respect to IVs. We derive metrics from human-computer interaction heuristics, specializing the metrics to address the characteristics of IVs and analysts. When there are no existing heuristics, analysts' needs and experiences inform new heuristics.



    Reviews

    Mariana Damova

The authors' research objective is to produce a set of heuristically derived metrics that are sensitive to measuring human interaction with interactive visualizations (IVs). This paper introduces a new approach to measuring IVs using human-computer interaction (HCI) heuristics. It describes a metric system that targets specific groups of information analysts and is built on an understanding of their workplace needs and experiences. The characteristics of young analysts, as opposed to senior analysts, are captured in the developed metrics. Young analysts' skill sets, the authors argue, are related to the playing of computer games, high computer literacy, collaboration, and multiprocessing, as well as the ability to operate in complicated on-screen visualization environments.

In the proposed approach, metrics are derived from HCI heuristics: research-based usability guidelines applied when designing and evaluating human interaction with technology. Established general HCI heuristics are adapted to develop new analyst-centered heuristics that pertain to measurable, desirable IV interface characteristics; these new heuristics are then turned into metrics. The selected new heuristics include logging interaction with targeted functionality and screen elements, and having analysts fill in surveys about different aspects and stages of their experience. These heuristics are intended to measure the level of comfort in using an IV designed with emerging technologies, such as three dimensions (3D), animations, and virtual environments, offering different views of the same data and promoting rapid, coordinated investigation and exploration.

The metrics are organized around seven values:

1. Empowering analysis: addressing analysts' needs for improved quality of analysis, and counting the number of networks identified, hypotheses generated, and inferences made.
2. Improving analytic products: addressing heuristics for facilitating reporting, assembling interim work products into reports, and rating the quality of the analysts' work.
3. Collaboration: addressing problems that are best solved by a team, comparing the work of novice and senior analysts, and counting the number of correct answers given by individuals and by teams.
4. Ease of use: addressing the high information density of IV interfaces, measuring analysts' disorientation at the onset, and counting the amount of time spent manipulating information elements.
5. Immediate feedback: addressing the need for quick feedback (as in computer game culture), and monitoring the length of time spent gazing at a screen area where a change in the IV is expected.
6. Tracking errors and critical incidents: measuring system errors and recording reasons for analysts' work stoppage, such as dissatisfaction, disorientation, confusion, frustration, and boredom.
7. Supporting minimal actions: promoting analysts' efficiency by looking into minimal eye movements and counting fixations and clicks (for example, as a measure of the effort required to perform a task).

This is a well-organized descriptive work with no reported results. The authors acknowledge many facets of an analyst's interaction with IVs, including a long list of intended future research aimed at obtaining the right set of metrics. There is no illustrative material at all. This paper offers a glance at the interesting and important topic of creating productivity tools, while leaving a lot to the individual reader's imagination.

Online Computing Reviews Service
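To make the flavor of the logging-based metrics concrete (clicks, fixations, time spent manipulating elements), here is a minimal sketch of how such measures might be accumulated from a timestamped interaction log. It is purely illustrative and not from the paper; the `Event` schema and `summarize` function are hypothetical.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Event:
    """One hypothetical row of an interaction log."""
    timestamp: float  # seconds since task start
    kind: str         # e.g. "click", "hover_start", "hover_end"
    element: str      # screen element the analyst interacted with

def summarize(events: list[Event]) -> dict:
    """Compute two illustrative metrics from a task's event log:
    clicks per element (cf. "supporting minimal actions") and time
    spent over each element (cf. "ease of use")."""
    clicks = defaultdict(int)
    hover_seconds = defaultdict(float)
    hover_start = {}
    for ev in sorted(events, key=lambda e: e.timestamp):
        if ev.kind == "click":
            clicks[ev.element] += 1
        elif ev.kind == "hover_start":
            hover_start[ev.element] = ev.timestamp
        elif ev.kind == "hover_end" and ev.element in hover_start:
            hover_seconds[ev.element] += ev.timestamp - hover_start.pop(ev.element)
    return {"clicks": dict(clicks), "hover_seconds": dict(hover_seconds)}

# Example: a few events from a made-up network-diagram task.
log = [
    Event(0.0, "hover_start", "node_A"),
    Event(2.5, "click", "node_A"),
    Event(4.0, "hover_end", "node_A"),
]
print(summarize(log))
# {'clicks': {'node_A': 1}, 'hover_seconds': {'node_A': 4.0}}
```

A real study would pair per-task summaries like this with the survey-based comfort ratings the review describes; the point here is only that several of the proposed metrics reduce to simple counts and durations over an event stream.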

Published in

CHI '08: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2008, 1870 pages
ISBN: 9781605580111
DOI: 10.1145/1357054

      Copyright © 2008 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States




      Acceptance Rates

CHI '08 paper acceptance rate: 157 of 714 submissions (22%)
Overall CHI acceptance rate: 6,199 of 26,314 submissions (24%)

