Influential Nuisance Factors on a Decision of Sufficient Testing
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0003-4127-5839
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems; University of York, York, UK. ORCID iD: 0000-0003-2415-8219
2015 (English). In: Algorithms and Architectures for Parallel Processing: ICA3PP International Workshops and Symposiums, Zhangjiajie, China, November 18–20, 2015, Proceedings, 2015, p. 819-828. Conference paper, Published paper (Refereed)
Abstract [en]

Testing of safety-critical embedded systems is an important and costly endeavor. To date, work has mainly focused on the design and application of diverse testing strategies, leaving open the question of when to stop testing a system. In our previous work, we proposed a convergence algorithm that informs the tester when the current testing strategy no longer appears to reveal new insight into the worst-case timing properties of system tasks and hence should be stopped. This algorithm was shown to be successful when applied across task sets with similar characteristics. For the convergence algorithm to become robust, it must hold even when the task set characteristics, here called nuisance factors, vary. In general, both the main factors under analysis, called design factors, and nuisance factors can influence the performance of a process or system. Nuisance factors are typically not of interest in the analysis; however, they vary from system to system and may have large effects on performance, so it is important to account for them. Consequently, this paper looks into a set of nuisance factors that affect the performance of our proposed convergence algorithm. More specifically, it is interested in situations in which the algorithm's performance degrades significantly, undermining its reliability. The work systematically analyzes the effect of each nuisance factor using a well-known statistical method and derives the most influential factors.

Place, publisher, year, edition, pages
2015. p. 819-828
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 9352
Keywords [en]
Testing, Safety, ALARP, Nuisance factor, Real-time system, ANOVA, Analysis of variance
National Category
Computer Systems
Identifiers
URN: urn:nbn:se:mdh:diva-30474
DOI: 10.1007/978-3-319-27161-3_75
ISI: 000373630000075
Scopus ID: 2-s2.0-84951948384
ISBN: 978-3-319-27160-6 (print)
OAI: oai:DiVA.org:mdh-30474
DiVA, id: diva2:886001
Conference
The 15th International Conference on Algorithms and Architectures for Parallel Processing ICA3PP'15, 18-20 Nov 2015, Zhangjiajie, China
Projects
SYNOPSIS - Safety Analysis for Predictable Software Intensive Systems
Available from: 2015-12-21 Created: 2015-12-21 Last updated: 2016-08-19 Bibliographically approved
In thesis
1. An ALARP Stop-Test Decision for the Worst-Case Timing Characteristics of Safety-Critical Systems
2016 (English)Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Safety-critical systems are those in which failure can lead to loss of human life or catastrophic damage to the environment. Timeliness is an important requirement in such systems; it relates to the notion of response time, i.e., the time a system takes to respond to stimuli from the environment. If the response time exceeds a specified time interval, a catastrophe might occur.

 

Stringent timing requirements make testing a necessary and important process, with which not only the correct system functionality but also the system timing behaviour has to be verified. However, a key issue for testers is determining when to stop testing: stopping too early may leave defects in the system, or even lead to a catastrophe if the undiscovered defects have a high severity level, while stopping too late wastes time and resources. To date, researchers and practitioners have mainly focused on the design and application of diverse testing strategies, leaving the critical stop-test decision a largely open issue, especially with respect to timeliness.

 

In the first part of this thesis, we propose a novel approach to making a stop-test decision in the context of testing the worst-case timing characteristics of systems. More specifically, we propose a convergence algorithm that informs the tester whether further testing would reveal significant new insight into the timing behaviour of the system and, if not, suggests that testing be stopped. The convergence algorithm looks at the response times observed during testing and examines whether the Maximum Observed Response Time (MORT) has recently increased; when this is no longer the case, it investigates whether the distribution of response times has changed significantly. When no significant new information about the system is revealed during a given period of time, it is concluded, with some statistical confidence, that more testing of the same nature is not going to be useful. However, other testing techniques may still yield significant new findings.
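As an illustration only (the thesis does not reproduce pseudocode here), the stop-test check described above might be sketched as follows. A two-sample Kolmogorov-Smirnov comparison stands in for the unspecified distribution test, and the function names, window size, and significance level are all assumptions:

```python
import math

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    between the empirical CDFs of samples a and b (ties handled)."""
    a, b = sorted(a), sorted(b)
    n, m = len(a), len(b)
    i = j = 0
    d = 0.0
    while i < n and j < m:
        x = min(a[i], b[j])
        while i < n and a[i] == x:
            i += 1
        while j < m and b[j] == x:
            j += 1
        d = max(d, abs(i / n - j / m))
    return d

def converged(response_times, window=100, alpha=0.05):
    """Hypothetical stop-test check: the MORT has not grown in the most
    recent window AND the last two windows of response times are
    statistically indistinguishable at level alpha."""
    if len(response_times) < 2 * window:
        return False
    recent = response_times[-window:]
    previous = response_times[-2 * window:-window]
    # 1) No new Maximum Observed Response Time in the latest window.
    if max(recent) > max(response_times[:-window]):
        return False
    # 2) Distribution unchanged: KS statistic below the asymptotic
    #    critical value c(alpha) * sqrt((n + m) / (n * m)).
    c_alpha = math.sqrt(-0.5 * math.log(alpha / 2.0))
    threshold = c_alpha * math.sqrt(2.0 / window)
    return ks_statistic(previous, recent) < threshold
```

A tester would feed each newly observed response time into the history and stop the campaign once `converged` returns true; the window size and alpha play the role of the tunable parameters discussed later.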

 

Furthermore, the convergence algorithm is evaluated against the As Low As Reasonably Practicable (ALARP) principle, an underpinning concept in most safety standards. ALARP involves weighing benefit against the associated cost. To evaluate the convergence algorithm, it is shown that the sacrifice, here testing time, would be grossly disproportionate to the benefit attained, which in this context is any further significant increase in the MORT after stopping the test.
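The disproportion argument can be sketched as a simple cost-benefit comparison. This is purely illustrative: the function name, the units, and the gross-disproportion factor are assumptions, not values from the thesis:

```python
def stop_is_alarp(extra_testing_time, expected_mort_gain,
                  gross_disproportion_factor=10.0):
    """Illustrative ALARP check: stopping is defensible when the cost of
    continued testing (here, testing time) is grossly disproportionate to
    the benefit (any further significant increase in the MORT)."""
    if expected_mort_gain <= 0:
        return True  # continued testing brings no benefit at all
    return (extra_testing_time / expected_mort_gain
            > gross_disproportion_factor)
```

In practice the expected MORT gain after a candidate stop point is only known in hindsight, which is why the thesis evaluates the decision empirically rather than computing it online.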

 

Our algorithm includes a set of tunable parameters. The second part of this work improves the algorithm's performance and scalability through the following steps: first, it is determined whether the parameters affect the algorithm at all; second, the most influential parameters are identified and tuned. This process is based on the Design of Experiments (DoE) approach.

 

Moreover, the algorithm is required to be robust, which in this context is defined as: “the algorithm provides valid stop-test decisions across a required range of task sets”. For example, if the system's number of tasks varies from 10 to 50 and the tasks' periods change from the range [200 μs, 400 μs] to the range [200 μs, 1000 μs], the algorithm's performance should not be adversely affected. To achieve robustness, first, the task set parameters most influential on the algorithm's performance are identified using the Analysis of Variance (ANOVA) approach. Second, it is examined whether the algorithm is sound over the required ranges of those parameters; if not, the situations in which its performance significantly degrades are identified. These situations will then be used in our future work to stress test the algorithm and to tune it so that it becomes robust across the required ranges.
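As a sketch of the screening step, a one-way ANOVA F statistic computed over one task-set parameter at a time can rank parameters by influence. The helper below is illustrative only; a full DoE analysis would fit a factorial model with interactions and compare F against critical values or p-values:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    variance. Each group holds the algorithm's performance scores observed
    at one level of a single factor (e.g. one range of task periods)."""
    k = len(groups)                      # number of factor levels
    n = sum(len(g) for g in groups)      # total number of observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    # Mean squares: df_between = k - 1, df_within = n - k.
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the critical value for (k-1, n-k) degrees of freedom marks the factor as influential; factors whose F is near 1 behave like noise.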

 

Finally, the convergence algorithm was shown to be successful when applied to task sets having similar characteristics. However, we observed some experiments in which the algorithm could not suggest a proper stop-test decision in compliance with the ALARP principle, e.g., it stopped sooner than expected. Therefore, we examine whether the algorithm itself can be further improved, focusing on the statistical test it uses and whether another test would perform better.

Place, publisher, year, edition, pages
Västerås: Mälardalen University, 2016
Series
Mälardalen University Press Licentiate Theses, ISSN 1651-9256 ; 238
National Category
Computer and Information Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:mdh:diva-32588
ISBN: 978-91-7485-279-0
Presentation
2016-09-19, Gamma, Mälardalens högskola, Västerås, 13:00 (English)
Opponent
Supervisors
Available from: 2016-08-19 Created: 2016-08-18 Last updated: 2018-01-10 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records BETA

Malekzadeh, Mahnaz; Bate, Iain
