Date of Award

1-1-2019

Document Type

Master's Thesis

Degree Name

M.S.

Organizational Unit

Daniel Felix Ritchie School of Engineering and Computer Science, Computer Science

First Advisor

Anneliese Andrews, Ph.D.

Second Advisor

Scott Leutenegger, Ph.D.

Third Advisor

Michael Keables, Ph.D.

Keywords

Evaluation, Gaps, Quality, Software testing, Systematic mapping study, Techniques

Abstract

Software testing techniques are crucial for detecting faults in software and reducing the risk of using it. As such, it is important that we have a good understanding of how to evaluate these techniques for their efficiency, scalability, applicability, and effectiveness at finding faults. This thesis enhances our understanding of testing technique evaluations by providing an overview of the current state of research. To accomplish this, we conduct a systematic mapping study, structuring the field and identifying research gaps and publication trends. We then present a small case study demonstrating how our mapping study can assist researchers in evaluating their own software testing techniques. We find that the majority of evaluations are empirical evaluations in the form of case studies and experiments, that most of these evaluations are of low quality when judged against established methodological guidelines, and that relatively few papers in the field discuss how testing techniques should be evaluated.

Publication Statement

Copyright is held by the author. User is responsible for all copyright compliance.

Rights Holder

Mitchell Mayeda

Provenance

Received from ProQuest

File Format

application/pdf

Language

English

Number of Pages

141

Discipline

Computer science


