5W+1H static analysis report quality measure

Maxim Menshchikov, Timur Lepikhin

Research output

1 Citation (Scopus)

Abstract

Modern development best practices rank static analysis quite high in the list of quality assurance methods. Static analyzers indicate the errors they find and help improve software quality. However, the quality of the reports themselves is rarely evaluated, if at all. In this paper we generalize analyzer output messages and explore ways to improve the reliability of comparison results. We introduce informational value as a measure of report quality with respect to the 5W (What, When, Where, Who, Why) and 1H (How To Fix) questions, formulate and verify a hypothesis about its independence from generic quality measures, suggest a methodology for including it in static analysis benchmarking, and present our observations after testing, which might help tool developers choose a direction towards more understandable reports.
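
As a rough illustration of the idea (not taken from the paper, whose actual definition of informational value is not reproduced in this abstract), a report could be scored by how many of the 5W+1H questions it answers. The field names, equal weights, and scoring function below are illustrative assumptions only, a minimal sketch in Python:

from dataclasses import dataclass
from typing import Optional

# Hypothetical representation of a single analyzer report.
# Field names correspond to the 5W+1H questions; all are assumptions for illustration.
@dataclass
class Report:
    what: Optional[str] = None        # defect description
    where: Optional[str] = None       # file/line location
    when: Optional[str] = None        # triggering path or condition
    who: Optional[str] = None         # responsible component or variable
    why: Optional[str] = None         # reasoning behind the verdict
    how_to_fix: Optional[str] = None  # suggested remediation

# Equal weights per question; the paper's measure may weight questions differently.
WEIGHTS = {"what": 1.0, "where": 1.0, "when": 1.0, "who": 1.0, "why": 1.0, "how_to_fix": 1.0}

def informational_value(report: Report) -> float:
    """Fraction of the 5W+1H questions answered by a report (0.0 to 1.0)."""
    answered = sum(w for q, w in WEIGHTS.items() if getattr(report, q))
    return answered / sum(WEIGHTS.values())

# Example: a report that answers What, Where and Why scores 0.50.
example = Report(
    what="NULL pointer dereference",
    where="parser.c:142",
    why="p may be NULL after a failed malloc on the path through line 120",
)
print(f"Informational value: {informational_value(example):.2f}")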

Original language: English
Pages (from-to): 114-126
Number of pages: 13
Journal: Communications in Computer and Information Science
Volume: 779
DOI: 10.1007/978-3-319-71734-0_10
Publication status: Published - 2018
Event: International Conference on Tools and Methods of Program Analysis - Moscow
Duration: 2 Mar 2017 - 3 Mar 2017
Conference number: 4
https://tmpaconf.org/ru/past-events/tmpa-2017/about-2017-ru

Scopus subject areas

  • Computer Science (all)
  • Mathematics (all)

Cite this

Menshchikov, Maxim; Lepikhin, Timur. 5W+1H static analysis report quality measure. In: Communications in Computer and Information Science. 2018; Vol. 779, pp. 114-126.
@article{d73b94ad5b69438ab8d098a09be250f1,
title = "5W+1H static analysis report quality measure",
abstract = "Modern development best practices rank static analysis quite high in a list of quality assurance methods. Static analyzers indicate errors found and help improve software quality. However, the quality of reports is merely evaluated, if done at all. In this paper we generalize analyzer output messages and explore ways to improve reliability of comparison results. We introduce informational value as a measure of report quality with respect to 5Ws (What, When, Where, Who, Why) and 1H (How To Fix) questions, formulate and verify a hypothesis about its independence on generic quality measures, suggest a methodology to include it into static analysis benchmarking and present our observations after testing, which might help tool developers choose the direction towards more understandable reports.",
keywords = "Informational value, Report quality measure, Static analysis, Understandability of reports",
author = "Maxim Menshchikov and Timur Lepikhin",
year = "2018",
doi = "10.1007/978-3-319-71734-0_10",
language = "English",
volume = "779",
pages = "114--126",
journal = "Communications in Computer and Information Science",
issn = "1865-0929",
publisher = "Springer",

}

TY - JOUR
T1 - 5W+1H static analysis report quality measure
AU - Menshchikov, Maxim
AU - Lepikhin, Timur
PY - 2018
Y1 - 2018
N2 - Modern development best practices rank static analysis quite high in a list of quality assurance methods. Static analyzers indicate errors found and help improve software quality. However, the quality of reports is merely evaluated, if done at all. In this paper we generalize analyzer output messages and explore ways to improve reliability of comparison results. We introduce informational value as a measure of report quality with respect to 5Ws (What, When, Where, Who, Why) and 1H (How To Fix) questions, formulate and verify a hypothesis about its independence on generic quality measures, suggest a methodology to include it into static analysis benchmarking and present our observations after testing, which might help tool developers choose the direction towards more understandable reports.
AB - Modern development best practices rank static analysis quite high in a list of quality assurance methods. Static analyzers indicate errors found and help improve software quality. However, the quality of reports is merely evaluated, if done at all. In this paper we generalize analyzer output messages and explore ways to improve reliability of comparison results. We introduce informational value as a measure of report quality with respect to 5Ws (What, When, Where, Who, Why) and 1H (How To Fix) questions, formulate and verify a hypothesis about its independence on generic quality measures, suggest a methodology to include it into static analysis benchmarking and present our observations after testing, which might help tool developers choose the direction towards more understandable reports.
KW - Informational value
KW - Report quality measure
KW - Static analysis
KW - Understandability of reports
UR - http://www.scopus.com/inward/record.url?scp=85040256491&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-71734-0_10
DO - 10.1007/978-3-319-71734-0_10
M3 - Article
AN - SCOPUS:85040256491
VL - 779
SP - 114
EP - 126
JO - Communications in Computer and Information Science
JF - Communications in Computer and Information Science
SN - 1865-0929
ER -