
Mitigating False Positive Static Analysis Warnings: Progress, Challenges, and Opportunities


Abstract:

Static analysis (SA) tools can generate useful static warnings that reveal problematic code snippets in a software system without dynamically executing the corresponding source code. Static warnings are of paramount importance because they can flag specific types of software defects at an early stage of the software development process, which substantially reduces maintenance costs. Unfortunately, because SA tools rely on conservative approximations, they also generate a large number of false positive (FP for short) warnings, i.e., warnings that do not indicate real bugs, which makes these tools less effective. Over the past two decades, therefore, many false positive mitigation (FPM for short) approaches have been proposed so that more accurate and critical warnings can be delivered to developers. This paper offers a detailed survey of research achievements on the topic of FPM. Based on the 130 collected papers, we conduct a comprehensive investigation from five perspectives. First, we reveal the research trends of this field. Second, we classify the existing FPM approaches into five types and present the concrete research progress of each. Third, we analyze the evaluation system used to examine the performance of the proposed approaches in terms of studied SA tools, evaluation scenarios, performance indicators, and collected datasets. Fourth, we summarize four types of empirical studies on SA warnings to distill findings that help reduce FP warnings. Finally, we sum up 10 challenges that remain unresolved in the literature, concerning systematicness, effectiveness, completeness, and practicability, and outline possible future research opportunities based on three emerging techniques.
Published in: IEEE Transactions on Software Engineering (Volume: 49, Issue: 12, December 2023)
Page(s): 5154 - 5188
Date of Publication: 02 November 2023


I. Introduction

STATIC analysis (SA) tools have been widely used in software quality assurance (SQA) activities to detect potentially problematic code snippets [99], [133], [136] in both commercial and open source software (OSS) systems. The reasons are twofold. First, SA tools can detect many kinds of software quality issues, such as coding defects [143], vulnerabilities [115], and code style violations [87]. Consequently, SQA resources (e.g., reviewer effort and test suites) can be allocated more effectively to improve software quality based on the detection results of SA tools. Second, SA tools provide a simple and convenient way to detect quality issues in a target program without executing it. Instead, these tools match the code against a set of pre-defined bug patterns summarized by software experts and report every code location captured by those patterns. Notably, most SA tools (e.g., FindBugs [12] and PMD [41]) are flexible and lightweight: they can be run as standalone command-line tools or as built-in components of popular IDEs such as Eclipse and IntelliJ IDEA. As a result, developers can run SA tools to extract a set of warnings from a target project and then manually review, understand, and fix them [147], as illustrated by the sketch below.
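To make the pattern-matching idea and its limitations concrete, the minimal Java sketch below (an illustrative example of ours, not drawn from any surveyed tool's documentation; the class and method names are invented) shows two snippets that FindBugs/PMD-style tools would typically report: the string comparison is a real defect that a pattern matcher flags reliably, whereas the intentional switch fall-through is a warning a developer would likely triage as a false positive.

```java
// Illustrative sketch of pattern-based warnings: one true positive and
// one plausible false positive. Names and rules are assumptions.
public class WarningExamples {

    // True positive: == compares String references, not contents, so two
    // logically equal strings may compare unequal. Pattern matchers flag
    // this kind of comparison reliably.
    static boolean sameUser(String a, String b) {
        return a == b; // should be a.equals(b), guarding against null
    }

    // Plausible false positive: the fall-through below is intentional
    // ("admin" implies "user" rights), but a conservative checker may still
    // report the missing break as a suspected bug.
    static int permissions(String role) {
        int mask = 0;
        switch (role) {
            case "admin":
                mask |= 0b10;   // admin-only rights
                // intentional fall-through
            case "user":
                mask |= 0b01;   // rights shared by all users
                break;
        }
        return mask;
    }

    public static void main(String[] args) {
        System.out.println(sameUser(new String("bob"), new String("bob"))); // false
        System.out.println(permissions("admin"));                           // 3
    }
}
```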
