
Low-overhead and fully automated statistical debugging with abstraction refinement

Published: 19 October 2016

Abstract

Cooperative statistical debugging is an effective approach for diagnosing production-run failures. To quickly identify failure predictors within the huge space of program predicates, existing techniques rely on random or heuristics-guided predicate sampling at the user side. However, none of them simultaneously satisfies low cost, low diagnosis latency, and high diagnosis quality, all of which are indispensable for statistical debugging to be practical.
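
To make the predicate-ranking step concrete, the sketch below shows one classic scoring scheme from this line of work, the Increase metric of Liblit et al.'s cooperative bug isolation (PLDI 2005). The class and field names are illustrative assumptions, not this paper's API; the paper's actual scoring model may differ.

// A minimal sketch of predicate scoring in cooperative statistical debugging,
// following the classic CBI "Increase" metric (Liblit et al., PLDI 2005).
// All names here are hypothetical.
final class PredicateStats {
    long trueInFailing, trueInPassing;         // runs where the predicate was observed and true
    long observedInFailing, observedInPassing; // runs where the predicate's site was reached at all

    // Failure(P): fraction of the runs in which P was true that failed.
    double failure() {
        long t = trueInFailing + trueInPassing;
        return t == 0 ? 0.0 : (double) trueInFailing / t;
    }

    // Context(P): baseline failure rate among runs that merely reached P's site.
    double context() {
        long o = observedInFailing + observedInPassing;
        return o == 0 ? 0.0 : (double) observedInFailing / o;
    }

    // Increase(P) = Failure(P) - Context(P); large positive values mark failure predictors.
    double increase() {
        return failure() - context();
    }
}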

This paper presents a new technique that tackles these challenges. We formulate the technique as an instance of abstraction refinement: efficient abstract-level profiling is first applied to the whole program, and the information it gathers pinpoints suspicious coarse-grained entities that need to be refined. Refinement then profiles the corresponding set of fine-grained entities and generates feedback that determines what to prune and what to refine next. The process is fully automated and, more importantly, guided by a mathematically rigorous analysis that guarantees our approach produces the same debugging results as an exhaustive analysis in deterministic settings.
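
As a rough illustration of that prune-or-refine loop, here is a best-first sketch in the style of branch-and-bound; the Entity interface and all names are assumptions for exposition, not the paper's implementation. Its equivalence to exhaustive analysis rests on the stated assumption that each coarse-grained entity's score soundly upper-bounds the scores of every fine-grained predicate it contains.

import java.util.*;

// A minimal sketch of abstraction refinement with pruning: coarse entities
// (e.g., functions) are profiled first; an entity whose upper bound cannot
// beat the best predicate found so far is pruned, never refined.
final class RefinementLoop {
    interface Entity {
        double upperBound();     // sound over-approximation of any contained predicate's score
        List<Entity> refine();   // next-level, finer-grained entities (empty if a leaf predicate)
        double exactScore();     // precise score, meaningful only for leaf predicates
    }

    // Returns the same top failure predictor as exhaustive profiling,
    // provided upperBound() is a sound bound on descendants' scores.
    static Entity bestPredictor(List<Entity> roots) {
        PriorityQueue<Entity> work =
            new PriorityQueue<>(Comparator.comparingDouble(Entity::upperBound).reversed());
        work.addAll(roots);
        Entity best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        while (!work.isEmpty()) {
            Entity e = work.poll();
            if (e.upperBound() <= bestScore) break;    // everything left is prunable
            List<Entity> children = e.refine();
            if (children.isEmpty()) {                  // leaf: an actual predicate
                double s = e.exactScore();
                if (s > bestScore) { bestScore = s; best = e; }
            } else {
                work.addAll(children);                 // profile these in the next round
            }
        }
        return best;
    }
}

In the cooperative setting, refine() would not be an in-memory call: each refinement round corresponds to deploying instrumentation for just the selected fine-grained entities to users and folding the collected profiles back in on the next iteration.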

We have implemented this technique for both C and Java, on both single machines and distributed systems. A thorough evaluation demonstrates that our approach yields (1) an order-of-magnitude reduction in user-side runtime overhead, even compared to a sampling-based approach, and (2) a two-orders-of-magnitude reduction in the size of data transferred over the network, fully automatically and without sacrificing any debugging capability.

Published in:

ACM SIGPLAN Notices, Volume 51, Issue 10 (OOPSLA '16), October 2016. 915 pages. ISSN: 0362-1340. EISSN: 1558-1160. DOI: 10.1145/3022671.

OOPSLA 2016: Proceedings of the 2016 ACM SIGPLAN International Conference on Object-Oriented Programming, Systems, Languages, and Applications, October 2016. 915 pages. ISBN: 9781450344449. DOI: 10.1145/2983990.

Copyright © 2016 ACM.

Publisher: Association for Computing Machinery, New York, NY, United States.
