DOI: 10.1145/3239235.3268920 · ESEM '18 Conference Proceedings · Research Article

Vulnerable open source dependencies: counting those that matter

Published: 11 October 2018

ABSTRACT

Background: Vulnerable dependencies are a known problem in today's open-source software ecosystems because OSS libraries are highly interconnected and developers do not always update their dependencies.

Aim: Our paper addresses the over-inflation problem in academic and industrial approaches to reporting vulnerable dependencies in OSS software, and thereby caters to the industrial need to allocate development and audit resources correctly.

Method: A careful analysis of deployed dependencies, aggregation of dependencies by their projects, and distinction of halted dependencies allow us to obtain a counting method that avoids over-inflation. To understand the industrial impact of a more precise approach, we considered the 200 most popular OSS Java libraries used by SAP in its own software. Our analysis covered 10,905 distinct GAVs (group, artifact, version coordinates) in Maven when considering all the library versions.
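The counting method above can be illustrated with a minimal sketch. The data model here (a `Dependency` record with a GAV coordinate, a Maven scope, and vulnerability/halted flags) is hypothetical and not the paper's actual tooling; it only shows the core idea that dependencies in non-deployed scopes such as `test` or `provided` are excluded before counting, and that deployed vulnerable dependencies split into those fixable by a simple version update and those that are halted:

```python
# Hypothetical sketch of the counting approach: only dependencies that ship
# with the deployed artifact count, and those split into "fixable by update"
# vs. "halted" (no maintained release fixes the vulnerability).
from dataclasses import dataclass

# Maven scopes that are not packaged into the deployed artifact.
NON_DEPLOYED_SCOPES = {"test", "provided"}

@dataclass
class Dependency:
    gav: str          # "group:artifact:version" Maven coordinate
    scope: str        # Maven scope, e.g. "compile" or "test"
    vulnerable: bool  # affected by a known vulnerability
    halted: bool      # no maintained release fixes the vulnerability

def count_that_matter(deps):
    """Return (deployed vulnerable, fixable by update, halted)."""
    deployed = [d for d in deps
                if d.vulnerable and d.scope not in NON_DEPLOYED_SCOPES]
    halted = [d for d in deployed if d.halted]
    return len(deployed), len(deployed) - len(halted), len(halted)

deps = [
    Dependency("org.example:web:1.0", "compile", True, False),
    Dependency("org.example:mock:2.1", "test", True, False),     # not deployed
    Dependency("org.example:legacy:0.9", "compile", True, True), # halted
]
print(count_that_matter(deps))  # (2, 1, 1)
```

In this toy input, the `test`-scoped dependency is vulnerable but never deployed, so it is not counted; of the two deployed vulnerable dependencies, one can be fixed by updating and one is halted, mirroring the paper's distinction between cheap and costly mitigations.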

Results: We found that about 20% of the dependencies affected by a known vulnerability are not deployed, and therefore do not represent a danger to the analyzed library because they cannot be exploited in practice. Developers of the analyzed libraries are able to fix (and are actually responsible for) 82% of the deployed vulnerable dependencies. The vast majority (81%) of vulnerable dependencies may be fixed by simply updating to a new version, while 1% of the vulnerable dependencies in our sample are halted, and therefore potentially require a costly mitigation strategy.

Conclusions: Our case study shows that correct counting allows software development companies to receive actionable information about their library dependencies and, therefore, to allocate costly development and audit resources correctly, whereas distorted measurements lead to inefficient spending.


Published in:
ESEM '18: Proceedings of the 12th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement
October 2018 · 487 pages
ISBN: 9781450358231
DOI: 10.1145/3239235

          Copyright © 2018 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rate: 130 of 594 submissions, 22%
