mdu.se Publications
Publications (10 of 42)
Eldh, S. (2024). A 40-Year Impact Perspective: Meet Your New Editor in Chief. IEEE Software, 41(1), 4-7
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 1, p. 4-7. Article in journal, Editorial material (Other academic). Published
Abstract [en]

Forty years after IEEE Software Magazine began, its impact is evident today. We must constantly ask ourselves whether we are building the product right and whether we are building the right products, as new ways of developing software challenge both research and industry.

Place, publisher, year, edition, pages
IEEE Computer Society, 2024
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-65789 (URN), 10.1109/MS.2023.3328199 (DOI), 001132030400005 (), 2-s2.0-85181131035 (Scopus ID)
Available from: 2024-01-31 Created: 2024-01-31 Last updated: 2024-01-31. Bibliographically approved
Eldh, S. (2024). Are We Keeping up With the Innovation in Generative AI? IEEE Software, 41(6), 4-8
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 6, p. 4-8. Article in journal (Refereed). Published
Abstract [en]

Achieving automatic, valid, verified answers from generative AI models is a goal that many of us are working on. We must find a way to ensure reliable and trustworthy results despite numerous challenges in the field.

Place, publisher, year, edition, pages
IEEE COMPUTER SOC, 2024
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:mdh:diva-69418 (URN), 10.1109/MS.2024.3441908 (DOI), 001329864000002 (), 2-s2.0-85205950825 (Scopus ID)
Available from: 2024-12-11 Created: 2024-12-11 Last updated: 2024-12-11. Bibliographically approved
Eldh, S. (2024). Code Review Evolution. IEEE Software, 41(5), 4-8
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 5, p. 4-8. Article in journal, Editorial material (Refereed). Published
Place, publisher, year, edition, pages
IEEE Computer Society, 2024
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-68216 (URN), 10.1109/MS.2024.3416648 (DOI), 001291162500002 (), 2-s2.0-85201075055 (Scopus ID)
Available from: 2024-08-21 Created: 2024-08-21 Last updated: 2024-08-28. Bibliographically approved
Eldh, S. (2024). Generative AI Is Changing How and What We Learn. IEEE Software, 41(2), 4-5
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 2, p. 4-5. Article in journal (Refereed). Published
Abstract [en]

This issue tackles the Future of Software Engineering Education and Training in the Age of AI. Generative AI tools will change how we learn. A new, more precise language is needed to communicate better with AI tools. Learn prompt engineering! - Sigrid Eldh, EIC, IEEE Software

Place, publisher, year, edition, pages
IEEE COMPUTER SOC, 2024
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:mdh:diva-66355 (URN), 10.1109/MS.2023.3346069 (DOI), 001183977400001 (), 2-s2.0-85188206907 (Scopus ID)
Available from: 2024-04-03 Created: 2024-04-03 Last updated: 2024-04-03. Bibliographically approved
Han, F., Eldh, S., Wiklund, K., Ermedahl, A., Haller, P. & Artho, C. (2024). In Industrial Embedded Software, are Some Compilation Errors Easier to Localize and Fix than Others? In: Proceedings - 2024 IEEE Conference on Software Testing, Verification and Validation, ICST 2024. Paper presented at 17th IEEE Conference on Software Testing, Verification and Validation, ICST 2024, Toronto, Canada, 27-31 May, 2024 (pp. 383-394). Institute of Electrical and Electronics Engineers (IEEE)
2024 (English) In: Proceedings - 2024 IEEE Conference on Software Testing, Verification and Validation, ICST 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, p. 383-394. Conference paper, Published paper (Refereed)
Abstract [en]

Industrial embedded systems often require specialized hardware. However, software engineers have access to such domain-specific hardware only at the continuous integration (CI) stage and have to use simulated hardware otherwise. This results in a higher proportion of compilation errors at the CI stage than in other types of systems, warranting a deeper study. To this end, we create a CI diagnostics solution called 'Shadow Job' that analyzes our industrial CI system. We collected over 40000 builds from 4 projects from the product source code and categorized the compilation errors into 14 error types, showing that the five most common ones comprise 89 % of all compilation errors. Additionally, we analyze the resolution time, size, and distance for each error type, to see if different types of compilation errors are easier to localize or repair than others. Our results show that the resolution time, size, and distance are independent of each other. Our research also provides insights into the human effort required to fix the most common industrial compilation errors. We also identify the most promising directions for future research on fault localization.
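
The paper's 'Shadow Job' tooling and its 14-type error taxonomy are not reproduced here; as a purely illustrative sketch, the following snippet shows how compiler diagnostics pulled from CI build logs could be bucketed into coarse categories and counted per build. The category names, regular expressions, and log format are assumptions made for illustration, not the authors' classification.

```python
import re
from collections import Counter

# Hypothetical, coarse-grained categories; the paper's actual 14 error types
# are not reproduced here.
ERROR_PATTERNS = {
    "undeclared_identifier": re.compile(r"error: .*(undeclared|not declared)"),
    "missing_header":        re.compile(r"fatal error: .*No such file or directory"),
    "type_mismatch":         re.compile(r"error: .*(incompatible|cannot convert)"),
    "syntax":                re.compile(r"error: expected "),
    "linker":                re.compile(r"undefined reference to"),
}

def classify_line(line: str) -> str | None:
    """Return the first matching category for a compiler/linker diagnostic line."""
    for category, pattern in ERROR_PATTERNS.items():
        if pattern.search(line):
            return category
    return "other" if "error:" in line else None

def summarize_build_log(log_text: str) -> Counter:
    """Count diagnostics per category in a single build log."""
    counts = Counter()
    for line in log_text.splitlines():
        category = classify_line(line)
        if category is not None:
            counts[category] += 1
    return counts

if __name__ == "__main__":
    demo_log = (
        "main.c:12:5: error: 'foo' undeclared (first use in this function)\n"
        "util.c:3:10: fatal error: hw_regs.h: No such file or directory\n"
    )
    print(summarize_build_log(demo_log))
```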

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
compilation error, continuous integration, fault localization, software build, Coding errors, Continuous integrations, Domain specific, Embedded-system, Error types, Integration systems, Resolution time, Specific hardware, Embedded software
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-68533 (URN), 10.1109/ICST60714.2024.00042 (DOI), 001307930000034 (), 2-s2.0-85203842024 (Scopus ID), 9798350308181 (ISBN)
Conference
17th IEEE Conference on Software Testing, Verification and Validation, ICST 2024, Toronto, Canada, 27-31 May, 2024
Available from: 2024-09-27 Created: 2024-09-27 Last updated: 2024-10-30. Bibliographically approved
Eldh, S. (2024). Let Us Thrive Well-Being for Humanity! IEEE Software, 41(4), 4-5
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 4, p. 4-5. Article in journal (Refereed). Published
Abstract [en]

What is the best pathway to acquire new knowledge, create a positive work-life balance, and plan and adapt successfully? Acceptance, tolerance, and inclusiveness: embracing these practices is key to transforming our outlook and attaining higher motivation and great results.

Place, publisher, year, edition, pages
IEEE Computer Society, 2024
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:mdh:diva-67670 (URN), 10.1109/MS.2024.3385454 (DOI), 001241761200005 (), 2-s2.0-85195783616 (Scopus ID)
Available from: 2024-06-19 Created: 2024-06-19 Last updated: 2024-06-19. Bibliographically approved
Eldh, S. (2024). Making Your Ideas Successful. IEEE Software, 41(3), 4-6
2024 (English) In: IEEE Software, ISSN 0740-7459, E-ISSN 1937-4194, Vol. 41, no 3, p. 4-6. Article in journal (Refereed). Published
Abstract [en]

Impact can be anything from a software solution or a product to changing people's minds about something. Successfully turning an idea into something that has an impact requires alignment among many contributing factors. This issue of IEEE Software gives us more insight into the journey from idea to impact.

Place, publisher, year, edition, pages
IEEE COMPUTER SOC, 2024
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:mdh:diva-69424 (URN), 10.1109/MS.2024.3363544 (DOI), 001197802900008 (), 2-s2.0-85190105014 (Scopus ID)
Available from: 2024-12-11 Created: 2024-12-11 Last updated: 2024-12-11. Bibliographically approved
Eldh, S. (2023). From Data Analysis to Human Input: Navigating the Complexity of Software Evaluation and Assessment. In: EASE'23: Proceedings of the 27th International Conference on Evaluation and Assessment in Software Engineering. Paper presented at the International Conference on Evaluation and Assessment in Software Engineering, Oulu, Finland, June 14-16, 2023 (pp. 1-1). Association for Computing Machinery (ACM)
2023 (English) In: EASE'23: Proceedings of the 27th International Conference on Evaluation and Assessment in Software Engineering, Association for Computing Machinery (ACM), 2023, p. 1-1. Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

It is the time of trust and transformation in software. We want explainable AI to assist us in dialogue, write our programs, test our software, and improve how we communicate. It is the time of digitalization, but we must ask ourselves – on what data, in what format, when do we collect it, and what is the source? Does “data” make sense? Every action can be automated, should eventually be automated, and as such should be traceable and explainable. The transformation of software – and how we can now train and feed back quickly – enables us not only to utilize existing technologies but also to embrace new technologies faster. This transformation is much too slow, even if things change at lightning speed. Change is the only thing we can be sure will happen. Evaluating and assessing the quality of software sounds easy but is only as good as you design it to be. We often simplify the problem so we can move forward, but it is the complications that are the real issue – our context, our combination of tools, languages, hardware, history, and way of working. We simply need the labeling, the metadata, the context – and this data in a form with “many” perspectives to draw the more “accurate” scientific picture. Having a multifaceted perspective is important when analyzing complex contexts. In software, listening skills and asking the right questions to the right people are often invaluable complements to blunt data. On the other side, much information is probably missing, as you too easily get “only” what you asked for. So we cannot judge what we cannot observe – and analyzing this data is another issue altogether. We need to know what is right – because if we cannot trust the source, or double-check the outcome, how would we know it is not just “fake” data? What does the outlier really mean? Is it a sign of a new trend, or is it the first time we have captured this odd event? It is therefore easy to lose perspective in a fast-changing world. Despite drowning in tools, we still miss a lot of them. The threshold for using a tool is high, as we cannot trust them, and we cannot be sure that the data these tools collect represents what we want to investigate. Therefore, the role of the scientist is more important than ever. Trusting the scientific process, utilizing multiple methods, and combining them is the recipe! Another goal is doing our best to select topics and collaborators – building better software (quality) for humanity. It starts with you and me. I hope that in this context I will be able to touch upon areas like security, testing, automation, AI/ML, ethics and “human in the loop”, analysis, tools, and technical debt, with a focus on evaluations and assessments.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-69574 (URN), 10.1145/3593434.3596439 (DOI), 9798400700446 (ISBN)
Conference
The International Conference on Evaluation and Assessment in Software Engineering, Oulu, Finland, June 14-16, 2023
Available from: 2024-12-13 Created: 2024-12-13 Last updated: 2024-12-13. Bibliographically approved
Eldh, S. (2022). On technical debt in software testing - observations from industry. In: Lecture Notes in Computer Science, vol. 13702. Paper presented at 11th International Symposium on Leveraging Applications of Formal Methods, Verification and Validation, ISoLA 2022, Rhodes, 22 October 2022 through 30 October 2022 (pp. 301-323). Springer Science and Business Media Deutschland GmbH
2022 (English) In: Lecture Notes in Computer Science, vol. 13702, Springer Science and Business Media Deutschland GmbH, 2022, p. 301-323. Conference paper, Published paper (Refereed)
Abstract [en]

Testing large complex systems in an agile way of working was a tough transition for systems with a large active legacy and a need to honour backward compatibility. The transition from manual testing to full test execution automation resulted in increased speed and manifested technical debt. The agile way of working, with continuous build and test, creates a lot of repetition through execution of the same tests. Overlap between agile teams producing similar test cases causes constant growth of the test suites. Despite the obvious improvement of automating millions of test cases, the numbers provide management with a false sense of security about how well the system is tested. The causes of technical debt should be addressed, instead of managing the symptoms. Technical debt in software testing could be addressed by refactoring, supported by known techniques like cloning, similarity analysis, test suite reduction, optimization and reducing known test smells. System quality can also be improved by utilizing metrics, e.g. code coverage and mutation score, or by using one of the many automated test design technologies. Why this is not addressed in industry has many causes. In this paper we describe observations from several industries, with a focus on large complex systems. The contribution lies in reflecting on observations made in the last decade and providing a vision which identifies improvements in the area of test automation and technical debt in software test, i.e. test code, test suites, test organisation, strategy and execution. Our conclusion is that many test technologies are now mature enough to be brought into regular use. The main hindrance is developers' skills and incentives to do so, as well as a lack of well-educated testers.
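
The paper itself does not ship code; as a minimal, hypothetical sketch of one technique the abstract names (similarity analysis for overlapping test cases), the following flags near-duplicate tests by Jaccard similarity over their token sets. The test names, suite contents, tokenization, and threshold are illustrative assumptions, not the paper's method.

```python
from itertools import combinations

def token_set(test_body: str) -> set[str]:
    """Very rough whitespace tokenization of a test's source text."""
    return set(test_body.split())

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two token sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(tests: dict[str, str], threshold: float = 0.8) -> list[tuple[str, str, float]]:
    """Report pairs of tests whose token sets overlap at or above the threshold."""
    pairs = []
    for (name_a, body_a), (name_b, body_b) in combinations(tests.items(), 2):
        score = jaccard(token_set(body_a), token_set(body_b))
        if score >= threshold:
            pairs.append((name_a, name_b, score))
    return sorted(pairs, key=lambda p: -p[2])

if __name__ == "__main__":
    # Hypothetical miniature suite with one obvious clone.
    suite = {
        "test_login_ok":      "login('alice', 'pw'); assert session.active",
        "test_login_ok_copy": "login('alice', 'pw'); assert session.active",
        "test_logout":        "logout(); assert not session.active",
    }
    for a, b, s in near_duplicates(suite):
        print(f"{a} ~ {b}: {s:.2f}")
```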

Place, publisher, year, edition, pages
Springer Science and Business Media Deutschland GmbH, 2022
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 13702 LNCS
Keywords
Agile development, Industry testing, Quality assurance, Technical debt, Test automation, Test maintenance, Test strategies, Agile manufacturing systems, Automation, Codes (symbols), Legacy systems, Testing, Large complex systems, Software testings, Technical debts, Test case, Test execution, Test maintenances, Software testing
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-61161 (URN), 10.1007/978-3-031-19756-7_17 (DOI), 2-s2.0-85142686832 (Scopus ID), 9783031197550 (ISBN)
Conference
11th International Symposium on Leveraging Applications of Formal Methods, Verification and Validation, ISoLA 2022, Rhodes, 22 October 2022 through 30 October 2022
Available from: 2022-12-07 Created: 2022-12-07 Last updated: 2022-12-07. Bibliographically approved
Fu, H., Eldh, S., Wiklund, K., Ermedahl, A. & Artho, C. (2022). Prevalence of continuous integration failures in industrial systems with hardware-in-the-loop testing. In: Proceedings - 2022 IEEE International Symposium on Software Reliability Engineering Workshops, ISSREW 2022: . Paper presented at 33rd IEEE International Symposium on Software Reliability Engineering Workshops, ISSREW 2022, Virtual, Online, 31 October 2022 through 3 November 2022 (pp. 61-66). Institute of Electrical and Electronics Engineers Inc.
2022 (English) In: Proceedings - 2022 IEEE International Symposium on Software Reliability Engineering Workshops, ISSREW 2022, Institute of Electrical and Electronics Engineers Inc., 2022, p. 61-66. Conference paper, Published paper (Refereed)
Abstract [en]

Faults in the automated continuous integration (CI) process can seriously impact the development of industrial code. To reduce manual intervention in automated CI processes, we want to understand better the CI systems' failure distribution to improve efficiency, reliability, and maintainability. This paper investigates failures in CI in four large industrial projects. We gather 11 731 builds over six months, identifying 1 414 failing builds. We also identify the distribution of different types of build failures in each of the four CI projects. Our results show that compilation is the most significant individual cause of failure with 47 %, followed by testing at 36 %. The checkout step with associated checks also incurs a non-negligible portion of failures with 12 %. Furthermore, we identify 14 distinct types of failures in the testing step. We conclude that configuration problems are a significant issue, as pipeline scripting and dependency errors make up a large number of failures.
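
The study's CI data and classification pipeline are not public; as a hedged illustration of the kind of analysis described (attributing each failing build to its first failing pipeline step and reporting the distribution), here is a minimal sketch. The build-record format, step names, and example numbers are assumptions, not the study's data.

```python
from collections import Counter

# Hypothetical ordered pipeline steps; the industrial pipelines in the study
# are more elaborate.
PIPELINE_STEPS = ["checkout", "compile", "test"]

def first_failing_step(build: dict[str, str]) -> str | None:
    """Return the first pipeline step that failed, or None if the build passed."""
    for step in PIPELINE_STEPS:
        if build.get(step) == "failed":
            return step
    return None

def failure_distribution(builds: list[dict[str, str]]) -> dict[str, float]:
    """Share of failing builds attributed to each step, as percentages."""
    causes = Counter(
        step for build in builds
        if (step := first_failing_step(build)) is not None
    )
    total = sum(causes.values())
    return {step: 100.0 * n / total for step, n in causes.items()} if total else {}

if __name__ == "__main__":
    builds = [
        {"checkout": "ok", "compile": "failed"},
        {"checkout": "ok", "compile": "ok", "test": "failed"},
        {"checkout": "ok", "compile": "ok", "test": "ok"},
    ]
    print(failure_distribution(builds))  # e.g. {'compile': 50.0, 'test': 50.0}
```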

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2022
Keywords
continuous integration, embedded system, failure classification, industry study, Integration, Integration testing, Continuous integrations, Embedded-system, Hardware-in-the-loop testing, Industrial codes, Industrial systems, Integration process, Integration systems, Manual intervention, Embedded systems
National Category
Software Engineering
Identifiers
urn:nbn:se:mdh:diva-61724 (URN), 10.1109/ISSREW55968.2022.00040 (DOI), 000909333700011 (), 2-s2.0-85146335651 (Scopus ID), 9781665476799 (ISBN)
Conference
33rd IEEE International Symposium on Software Reliability Engineering Workshops, ISSREW 2022, Virtual, Online, 31 October 2022 through 3 November 2022
Available from: 2023-02-01 Created: 2023-02-01 Last updated: 2023-03-01. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-5070-9312
