Abstract
This paper focuses on a federated learning (FL) system that employs a base station as the central server while clients with limited computation capabilities perform local training. Because of limited bandwidth, only a portion of the clients can participate in each FL training round, and the choice of participating clients affects the performance of the FL system, which calls for effective allocation of their computing resources. In FL systems, both model convergence and energy consumption are important performance metrics. To this end, we formulate a multi-objective optimization problem (MOP) to simultaneously speed up model convergence and reduce energy consumption. To address the MOP, we propose a multi-objective algorithm (MOA) for FL systems that obtains a Pareto-optimal solution set, where the Tchebycheff approach is adopted to decompose the MOP into multiple single-objective problems, each optimized by differential evolution. Extensive experiments on the Fashion-MNIST dataset under both i.i.d. and non-i.i.d. data settings show that MOA outperforms other algorithms.
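As a rough illustration of the decomposition step described in the abstract, the sketch below applies a weighted Tchebycheff scalarization and solves each resulting single-objective subproblem with SciPy's differential evolution routine on a toy bi-objective problem. The two objective functions, the decision-vector dimension, and the weight sweep are hypothetical stand-ins; the paper's actual convergence-time and energy models and its MOA implementation are not reproduced here.

```python
# Minimal sketch: weighted Tchebycheff decomposition of a bi-objective problem,
# with each scalarized subproblem solved by differential evolution.
# The objectives below are hypothetical surrogates, NOT the paper's models.
import numpy as np
from scipy.optimize import differential_evolution

def f_convergence(x):
    # Hypothetical surrogate for "rounds/time to reach target accuracy".
    return float(np.sum((x - 0.2) ** 2))

def f_energy(x):
    # Hypothetical surrogate for "total client energy consumption".
    return float(np.sum((x - 0.8) ** 2))

def tchebycheff(x, w, z_star):
    # Weighted Tchebycheff scalarization: g(x) = max_i w_i * |f_i(x) - z_i*|.
    f = np.array([f_convergence(x), f_energy(x)])
    return float(np.max(w * np.abs(f - z_star)))

bounds = [(0.0, 1.0)] * 4          # toy 4-dimensional decision vector
z_star = np.array([0.0, 0.0])      # ideal point (both toy objectives are >= 0)

pareto_front = []
for lam in np.linspace(0.05, 0.95, 10):   # sweep of weight vectors, one subproblem each
    w = np.array([lam, 1.0 - lam])
    res = differential_evolution(tchebycheff, bounds, args=(w, z_star), seed=0)
    pareto_front.append((f_convergence(res.x), f_energy(res.x)))

for fc, fe in pareto_front:
    print(f"convergence surrogate = {fc:.3f}, energy surrogate = {fe:.3f}")
```

Collecting the best solution of each subproblem in this way yields an approximation of the Pareto-optimal set, which is the general idea behind Tchebycheff-based decomposition.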
Original language | English |
---|---|
Title of host publication | Proceedings of 2023 IEEE/CIC International Conference on Communications in China (ICCC) |
Number of pages | 5 |
Publisher | IEEE |
Publication date | 2023 |
ISBN (Electronic) | 979-8-3503-4538-4 |
DOIs | |
Publication status | Published - 2023 |
Event | 12th IEEE/CIC International Conference on Communications in China - Dalian, China; Duration: 10 Aug 2023 → 12 Aug 2023; Conference number: 12 |
Conference
Conference | 12th IEEE/CIC International Conference on Communications in China |
---|---|
Number | 12 |
Country/Territory | China |
City | Dalian |
Period | 10/08/2023 → 12/08/2023 |
Keywords
- Federated learning
- Multi-objective algorithm
- Differential evolution
- Decomposition