A Comparative Study of Manual and Automated Testing for Industrial Control Software
In: International Conference on Software Testing, Verification and Validation (ICST 2017). Conference paper (Refereed). Language: English.
Automated test generation has been suggested as a way to create tests at lower cost. However, how such tests compare with manually written ones in terms of cost and effectiveness is not well studied. This is particularly true for industrial control software, where strict requirements on both specification-based testing and code coverage are typically met through rigorous manual testing. To address this gap, we conducted a case study comparing manually and automatically created tests. We used recently developed real-world industrial programs written in IEC 61131-3, a popular programming language for developing industrial control systems running on programmable logic controllers. The results show that automatically generated tests achieve code coverage similar to that of manually created tests, but in a fraction of the time (an average improvement of roughly 90%). We also found that using an automated test generation tool does not yield better fault detection, measured as mutation score, than manual testing. Specifically, manual tests detect logical, timer, and negation faults more effectively than automatically generated tests do. These results underscore the need to further study how manual testing is performed in industrial practice and the extent to which automated test generation can be used in the development of reliable systems.
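The mutation score used above as the fault-detection measure is, in its standard form, the fraction of injected faulty program variants (mutants) that a test suite detects ("kills"). A minimal sketch of that computation, in Python for illustration (the function name and example counts are hypothetical, not taken from the study):

```python
def mutation_score(killed: int, total: int) -> float:
    """Fraction of injected mutants detected (killed) by a test suite.

    killed: number of mutants for which at least one test failed.
    total:  number of mutants generated (excluding equivalent mutants,
            which cannot be killed by any test).
    """
    if total <= 0:
        raise ValueError("total mutants must be positive")
    return killed / total


# Hypothetical example: a suite that kills 45 of 60 injected mutants.
score = mutation_score(45, 60)
print(f"{score:.2f}")  # 0.75
```

Comparing the mutation scores of a manually written suite and an automatically generated one on the same mutant set is the kind of comparison the study reports, e.g. for logical, timer, and negation mutation operators.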
Identifiers
URN: urn:nbn:se:mdh:diva-34088
OAI: oai:DiVA.org:mdh-34088
DiVA: diva2:1056601
International Conference on Software Testing, Verification and Validation ICST 2017, 13 Mar 2017, Tokyo, Japan
Projects
ITS-EASY Post Graduate School for Embedded Software and Systems
TOCSYC - Testing of Critical System Characteristics (KKS)
AGENTS - Automated Generation of Tests for Simulated Software Systems (KKS)