Tiny Federated Learning with Bayesian Classifiers
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0001-9857-4317
Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ORCID iD: 0000-0001-5269-3900
2023 (English). In: IEEE International Symposium on Industrial Electronics, Institute of Electrical and Electronics Engineers Inc., 2023. Conference paper, Published paper (Refereed)
Abstract [en]

Tiny machine learning (TinyML) is an emerging research direction that aims to realize machine learning on Internet of Things (IoT) devices. Current TinyML research largely focuses on deploying deep learning models on microprocessors, while the models themselves are trained on high-performance computers or in the cloud. In resource- and time-constrained IoT contexts, however, it is more desirable to perform data analytics and learning tasks directly on edge devices, which offers crucial benefits such as increased energy efficiency, reduced latency, and lower communication cost. To address this challenge, this paper proposes a tiny federated learning algorithm, referred to as TFL-BC, that enables learning of Bayesian classifiers from distributed tiny data storage. In TFL-BC, Bayesian learning is executed in parallel across multiple edge devices using local (tiny) training data, and the learning results from the local devices are subsequently aggregated by a central node to obtain the final classification model. Experiments on a set of benchmark datasets demonstrate that the algorithm produces final aggregated models that outperform single tiny Bayesian classifiers, and that the result of tiny federated learning of a Bayesian classifier is independent of the number of data partitions used to generate the distributed local training data.
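The abstract does not specify the aggregation scheme, so the sketch below shows one plausible instantiation of the idea, not the paper's actual method: each edge device computes additive sufficient statistics for a Gaussian naive Bayes classifier on its local data, and a central node sums them to form the aggregated model. All function names (`local_stats`, `aggregate`, `predict`) are illustrative assumptions.

```python
import numpy as np

def local_stats(X, y, n_classes):
    """Per-device sufficient statistics for Gaussian naive Bayes:
    class counts, per-class feature sums, and per-class sums of squares.
    These are additive, so a central node can simply sum them."""
    n_features = X.shape[1]
    counts = np.zeros(n_classes)
    sums = np.zeros((n_classes, n_features))
    sq_sums = np.zeros((n_classes, n_features))
    for c in range(n_classes):
        Xc = X[y == c]
        counts[c] = len(Xc)
        if len(Xc):
            sums[c] = Xc.sum(axis=0)
            sq_sums[c] = (Xc ** 2).sum(axis=0)
    return counts, sums, sq_sums

def aggregate(stats_list, eps=1e-9):
    """Central node: sum the statistics from all devices and derive
    class priors, per-class means, and per-class variances."""
    counts = sum(s[0] for s in stats_list)
    sums = sum(s[1] for s in stats_list)
    sq_sums = sum(s[2] for s in stats_list)
    priors = counts / counts.sum()
    means = sums / counts[:, None]
    variances = sq_sums / counts[:, None] - means ** 2 + eps
    return priors, means, variances

def predict(model, X):
    """Classify by maximum posterior: Gaussian log-likelihood + log prior."""
    priors, means, variances = model
    log_lik = -0.5 * (np.log(2 * np.pi * variances)[None]
                      + (X[:, None, :] - means[None]) ** 2
                      / variances[None]).sum(-1)
    return np.argmax(log_lik + np.log(priors)[None], axis=1)
```

Because the statistics are additive, summing them over any partitioning of the data yields exactly the same aggregated model as training on the pooled data, which is consistent with the partition-independence result the abstract reports.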

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2023.
Keywords [en]
Bayesian classifier, edge computing, federated learning, tiny machine learning, Classification (of information), Data Analytics, Deep learning, Digital storage, Energy efficiency, Learning algorithms, Learning systems, High performance computers, Learning models, Machine learning research, Machine-learning, Training data, Internet of things
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:mdh:diva-64430
DOI: 10.1109/ISIE51358.2023.10228115
Scopus ID: 2-s2.0-85172110028
ISBN: 9798350399714 (print)
OAI: oai:DiVA.org:mdh-64430
DiVA, id: diva2:1803406
Conference
IEEE International Symposium on Industrial Electronics
Available from: 2023-10-09 Created: 2023-10-09 Last updated: 2023-10-09 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Xiong, Ning; Punnekkat, Sasikumar
