Limitations and Future Aspects of Communication Costs in Federated Learning: A Survey

Muhammad Asad, Saima Shaukat, Dou Hu, Zekun Wang, Ehsan Javanmardi, Jin Nakazato, Manabu Tsukada

Research output: Review article, peer-reviewed

23 citations (Scopus)

Abstract

This paper explores the potential for communication-efficient federated learning (FL) in modern distributed systems. FL is an emerging distributed machine learning technique that enables the collaborative training of a single model across multiple geographically dispersed clients. This paper surveys the main approaches to communication-efficient FL, including model updates, compression techniques, resource management for the edge and cloud, and client selection. We also review the optimization techniques associated with communication-efficient FL, such as compression schemes and structured updates. Finally, we highlight the current research challenges and discuss potential future directions for communication-efficient FL.
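As a concrete illustration of the compression schemes for model updates that the survey covers, the sketch below shows top-k sparsification of a client update in plain NumPy. This is not code from the paper; the function names, the 1% sparsity fraction, and the client/server split are assumptions made for demonstration only.

```python
# Illustrative sketch (not from the survey): top-k sparsification of a client
# update, one common compression scheme in communication-efficient FL.
# All names and the k_fraction value are assumptions for demonstration.
import numpy as np

def topk_sparsify(update: np.ndarray, k_fraction: float = 0.01):
    """Keep only the largest-magnitude k% of entries in a client update.

    Returns the indices and values to transmit instead of the dense update,
    reducing upstream communication roughly by a factor of 1 / k_fraction.
    """
    flat = update.ravel()
    k = max(1, int(k_fraction * flat.size))
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def densify(idx: np.ndarray, vals: np.ndarray, shape) -> np.ndarray:
    """Server-side reconstruction of the sparse update into a dense tensor."""
    dense = np.zeros(int(np.prod(shape)), dtype=vals.dtype)
    dense[idx] = vals
    return dense.reshape(shape)

# Example: a client compresses a gradient before sending it to the server.
grad = np.random.randn(1000, 100).astype(np.float32)
idx, vals = topk_sparsify(grad, k_fraction=0.01)
recovered = densify(idx, vals, grad.shape)
```

In practice such schemes are usually paired with error feedback (accumulating the dropped residual locally), which the survey discusses under structured and compressed updates.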

Original language: English
Article number: 7358
Journal: Sensors
Volume: 23
Issue number: 17
DOI
Publication status: Published - September 2023
