2023 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]
Federated Learning (FL) has emerged as a novel paradigm within machine learning (ML) that allows multiple devices to collaboratively train a shared ML model without sharing their private data with a central server. FL has gained popularity across various applications because it eliminates the need for centralized data storage, thereby improving the confidentiality of sensitive information. Among these emerging FL applications, this thesis focuses on Speech Emotion Recognition (SER), which analyzes audio signals from human speech to identify patterns and classify the conveyed emotions. When SER is implemented within an FL framework, the speech data remains on local devices, yet new privacy challenges emerge during the training phase and the exchange of SER model updates between the server and clients. These challenges include privacy leakage and adversarial attacks, such as model inversion and membership or property inference attacks, which unauthorized or malicious parties can mount against the shared SER model to compromise client data confidentiality and reveal sensitive information.
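As a point of reference, the sketch below illustrates one round of the generic federated averaging (FedAvg) scheme that underlies the exchange described above: clients train locally on their private speech data, and only model parameters are sent to the server. It is a minimal, hypothetical example (fedavg_round, local_train, and clients are placeholder names), not the specific SER setup developed in the thesis.

import numpy as np

def fedavg_round(global_weights, clients, local_train):
    # One illustrative round: each client trains on its private speech data
    # locally, and only model parameters are exchanged, never raw audio.
    updates, sizes = [], []
    for client_data in clients:
        local_weights = local_train(global_weights, client_data)  # runs on the client device
        updates.append(local_weights)
        sizes.append(len(client_data))
    total = float(sum(sizes))
    # Server-side aggregation: average the updates weighted by local dataset size.
    return [
        np.sum([w[i] * (n / total) for w, n in zip(updates, sizes)], axis=0)
        for i in range(len(global_weights))
    ]

Even though no raw audio leaves the devices in such a scheme, the exchanged parameter updates are precisely what model inversion and membership or property inference attacks target.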
While several privacy-preserving solutions have been developed to mitigate potential breaches in FL architectures, most are too generic to be integrated easily into specific applications. Furthermore, incorporating existing privacy-preserving mechanisms into the FL framework can increase communication and computational overhead, which may, in turn, compromise data utility and learning performance.
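To illustrate the trade-off, the following hypothetical client-side routine applies a widely used generic mechanism (DP-SGD-style clipping and Gaussian noising of the model update) before transmission. It is a sketch assuming the update is a list of NumPy arrays, not one of the mechanisms proposed in this thesis: a larger noise_std strengthens privacy but degrades utility, and the extra processing adds computational overhead on each client.

import numpy as np

def privatize_update(update, clip_norm=1.0, noise_std=0.1, rng=None):
    # Clip the overall L2 norm of the update, then add Gaussian noise to every
    # parameter before sending it to the server (DP-SGD-style perturbation).
    rng = rng if rng is not None else np.random.default_rng()
    flat = np.concatenate([np.ravel(u) for u in update])
    scale = min(1.0, clip_norm / (np.linalg.norm(flat) + 1e-12))
    return [u * scale + rng.normal(0.0, noise_std, size=u.shape) for u in update]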
This thesis aims to propose privacy-preserving methods in FL for emerging security-critical applications such as SER while addressing their impact on performance. First, we categorize and analyze recent research on privacy-preserving mechanisms in FL, focusing on their effects on FL performance and on how to balance privacy and performance across various applications. Second, we design an optimized FL setup tailored to SER applications to evaluate its effects on performance and overhead. Third, we design and develop privacy-preserving mechanisms within FL that safeguard against potential privacy threats while ensuring the confidentiality of clients' data. Finally, we propose and evaluate new FL methods for SER and integrate them with appropriate privacy-preserving mechanisms to balance privacy against efficiency, accuracy, and communication and computation overhead.
Place, publisher, year, edition, pages
Västerås: Mälardalens universitet, 2023
Series
Mälardalen University Press Licentiate Theses, ISSN 1651-9256 ; 349
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:mdh:diva-64679 (URN)
978-91-7485-621-7 (ISBN)
Presentation
2023-12-14, Paros, Mälardalens universitet, Västerås, 13:00 (English)
Opponent
Supervisors
Available from: 2023-11-07 Created: 2023-11-06 Last updated: 2023-11-23 Bibliographically approved