Delta-Bench: Differential Benchmark for Static Analysis Security Testing Tools


Abstract:

Background: Static analysis security testing (SAST) tools may be evaluated using synthetic micro-benchmarks and benchmarks based on real-world software. Aims: The aim of this study is to address the limitations of existing SAST tool benchmarks: lack of vulnerability realism, uncertain ground truth, and a large number of findings unrelated to the analysed vulnerability. Method: We propose Delta-Bench, a novel approach for the automatic construction of benchmarks for SAST tools based on differencing vulnerable and fixed versions in Free and Open Source (FOSS) repositories. To test our approach, we ran 7 state-of-the-art SAST tools against 70 revisions of four major versions of Apache Tomcat, spanning 62 distinct Common Vulnerabilities and Exposures (CVE) fixes and vulnerable files totalling over 100K lines of code, as the source of ground-truth vulnerabilities. Results: Our experiment allows us to draw interesting conclusions (e.g., tools perform differently depending on the selected benchmark). Conclusions: Delta-Bench allows SAST tools to be automatically evaluated on real-world historical vulnerabilities using only the findings that a tool produced for the analysed vulnerability.
Date of Conference: 09-10 November 2017
Date Added to IEEE Xplore: 11 December 2017
Conference Location: Toronto, ON, Canada

I. Introduction

Designing a benchmark with real-world software is a challenging task [1]. Therefore, existing approaches either insert bugs artificially [2], [3], or use historical bugs from the software repository of a project [4]. Artificial bug injection is often difficult to verify (see [2, p.2]), whilst historical vulnerabilities may represent only a subset of the ground truth.
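The differencing idea behind Delta-Bench can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes that the lines a fix changed or removed in the vulnerable version of a file are taken as ground-truth vulnerable locations, and that a SAST finding counts as a true positive only if it points at one of those lines. The function names (`ground_truth_lines`, `score_findings`) are hypothetical.

```python
import difflib

def ground_truth_lines(vulnerable_src: str, fixed_src: str) -> set:
    """Return 1-based line numbers in the vulnerable version that the fix
    changed or removed; these serve as ground-truth vulnerable locations."""
    matcher = difflib.SequenceMatcher(
        a=vulnerable_src.splitlines(), b=fixed_src.splitlines()
    )
    lines = set()
    for tag, i1, i2, _j1, _j2 in matcher.get_opcodes():
        if tag in ("replace", "delete"):
            # Offsets from SequenceMatcher are 0-based; report 1-based lines.
            lines.update(range(i1 + 1, i2 + 1))
    return lines

def score_findings(findings: set, truth: set) -> tuple:
    """Count a tool finding as a true positive only if it falls on a
    ground-truth line; every other finding is a false positive."""
    return len(findings & truth), len(findings - truth)
```

With this scoring, findings unrelated to the analysed vulnerability are excluded from the true-positive count, which is the benchmark limitation the abstract highlights.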

Cites in Papers - IEEE (10)

1.
Pranet Sharma, Zhenpeng Shi, Şevval Şimşek, David Starobinski, David Sastre Medina, "Understanding Similarities and Differences Between Software Composition Analysis Tools", IEEE Security & Privacy, vol.23, no.1, pp.53-63, 2025.
2.
Zongjie Li, Zhibo Liu, Wai Kin Wong, Pingchuan Ma, Shuai Wang, "Evaluating C/C++ Vulnerability Detectability of Query-Based Static Application Security Testing Tools", IEEE Transactions on Dependable and Secure Computing, vol.21, no.5, pp.4600-4618, 2024.
3.
Domenico Gigante, Fabiano Pecorelli, Vita Santa Barletta, Andrea Janes, Valentina Lenarduzzi, Davide Taibi, Maria Teresa Baldassarre, "Resolving Security Issues via Quality-Oriented Refactoring: A User Study", 2023 ACM/IEEE International Conference on Technical Debt (TechDebt), pp.82-91, 2023.
4.
Saikat Chakraborty, Rahul Krishna, Yangruibo Ding, Baishakhi Ray, "Deep Learning Based Vulnerability Detection: Are We There Yet?", IEEE Transactions on Software Engineering, vol.48, no.9, pp.3280-3296, 2022.
5.
Xiaoxue Wu, Wei Zheng, Xin Xia, David Lo, "Data Quality Matters: A Case Study on Data Label Correctness for Security Bug Report Prediction", IEEE Transactions on Software Engineering, vol.48, no.7, pp.2541-2556, 2022.
6.
Miquel Martínez, Juan-Carlos Ruiz, Nuno Antunes, David de Andrés, Marco Vieira, "A Multi-Criteria Analysis of Benchmark Results With Expert Support for Security Tools", IEEE Transactions on Dependable and Secure Computing, vol.19, no.4, pp.2151-2164, 2022.
7.
Alex Groce, Iftekhar Ahmed, Josselin Feist, Gustavo Grieco, Jiri Gesi, Mehran Meidani, Qihong Chen, "Evaluating and Improving Static Analysis Tools Via Differential Mutation Analysis", 2021 IEEE 21st International Conference on Software Quality, Reliability and Security (QRS), pp.207-218, 2021.
8.
Ivan Pashchenko, Riccardo Scandariato, Antonino Sabetta, Fabio Massacci, "Secure Software Development in the Era of Fluid Multi-party Open Software and Services", 2021 IEEE/ACM 43rd International Conference on Software Engineering: New Ideas and Emerging Results (ICSE-NIER), pp.91-95, 2021.
9.
Gaojian Hao, Feng Li, Wei Huo, Qing Sun, Wei Wang, Xinhua Li, Wei Zou, "Constructing Benchmarks for Supporting Explainable Evaluations of Static Application Security Testing Tools", 2019 International Symposium on Theoretical Aspects of Software Engineering (TASE), pp.65-72, 2019.
10.
Reza M. Parizi, Kai Qian, Hossain Shahriar, Fan Wu, Lixin Tao, "Benchmark Requirements for Assessing Software Security Vulnerability Testing Tools", 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC), vol.01, pp.825-826, 2018.

Cites in Papers - Other Publishers (1)

1.
Francesc Mateo Tudela, Juan-Ramón Bermejo Higuera, Javier Bermejo Higuera, Juan-Antonio Sicilia Montalvo, Michael I. Argyros, "On Combining Static, Dynamic and Interactive Analysis Security Testing Tools to Improve OWASP Top Ten Security Vulnerability Detection in Web Applications", Applied Sciences, vol.10, no.24, pp.9119, 2020.
