DOI: 10.1145/2483760.2483778
Research article

Dynamically validating static memory leak warnings

Published: 15 July 2013

ABSTRACT

Memory leaks have a significant impact on software availability, performance, and security. Static analysis has been widely used to find memory leaks in C/C++ programs. Although a static analysis is able to find all potential leaks in a program, it often reports a great number of false warnings. Manually validating these warnings is a daunting task, which significantly limits the practicality of the analysis. In this paper, we develop a novel dynamic technique that automatically validates and categorizes such warnings to unleash the power of static memory leak detectors. Our technique analyzes each warning, which contains information about the leaking allocation site and the leaking path, generates test cases to cover the leaking path, and tracks objects created by the leaking allocation site. Eventually, warnings are classified into four categories: MUST-LEAK, LIKELY-NOT-LEAK, BLOAT, and MAY-LEAK. Warnings in MUST-LEAK are guaranteed by our analysis to be true leaks. Warnings in LIKELY-NOT-LEAK are highly likely to be false warnings; although we cannot provide a formal guarantee that they are not leaks, we have high confidence that this is the case. Warnings in BLOAT are also unlikely to be leaks, but they should be fixed to improve performance. Using our approach, the developer's manual validation effort needs to focus only on warnings in MAY-LEAK, a category that is often much smaller than the original set.


Published in

ISSTA 2013: Proceedings of the 2013 International Symposium on Software Testing and Analysis
July 2013, 381 pages
ISBN: 9781450321594
DOI: 10.1145/2483760
Copyright © 2013 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 58 of 213 submissions (27%)
