K Asynchronous Federated Learning with Cosine Similarity Based Aggregation on Non-IID Data

Algorithms and Architectures for Parallel Processing, ICA3PP 2023, Part VI (2024)

Abstract
In asynchronous federated learning, each device updates the model independently as soon as it becomes available, without waiting for other devices. However, this approach faces two critical challenges, non-IID data and stale updates, both of which can adversely impact model performance. To address these challenges, we propose a novel framework called Class-balanced K-Asynchronous Federated Learning (CKAFL), which adopts a two-pronged approach that tackles non-IID data on the client side and staleness on the server side. On the server side, we give a novel evaluation method that employs cosine similarity to measure the staleness of a delayed gradient and uses it to optimize the aggregation algorithm. On the client side, we introduce a class-balanced loss function to mitigate the effect of non-IID data. To evaluate the effectiveness of CKAFL, we conduct extensive experiments on three commonly used datasets. The experimental results show that even when a large proportion of devices submit stale updates, CKAFL outperforms the baselines in both non-IID and IID settings.
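The abstract does not spell out the aggregation rule, but the idea of weighting delayed gradients by their cosine similarity to a fresh reference direction can be illustrated with a minimal sketch. Everything below (the function names, the choice of reference gradient, the zero-clipping of negative similarities, and the learning rate) is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened gradient vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-12
    return float(np.dot(a, b) / denom)

def aggregate_k_async(global_model: np.ndarray,
                      client_grads: list,
                      reference_grad: np.ndarray,
                      lr: float = 0.1) -> np.ndarray:
    """Aggregate the first K client gradients that arrive, down-weighting
    those whose direction diverges from a reference (fresh) gradient."""
    # Clip negative similarities to zero so that strongly conflicting
    # (i.e. very stale) gradients contribute nothing to the update.
    weights = [max(cosine_similarity(g, reference_grad), 0.0)
               for g in client_grads]
    total = sum(weights) + 1e-12
    mixed = sum(w * g for w, g in zip(weights, client_grads)) / total
    return global_model - lr * mixed

# Toy usage: a 4-parameter model, K = 3 arriving gradients, one of them stale.
model = np.zeros(4)
fresh = np.array([1.0, 0.5, -0.2, 0.3])      # reference descent direction
grads = [fresh, 0.9 * fresh, -fresh]         # the last one points backwards
model = aggregate_k_async(model, grads, fresh)
```

In this sketch the similarity score simply scales each gradient's contribution, which captures the intuition that a delayed gradient pointing away from the current descent direction should count less in the server-side aggregation.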
Key words
Federated Learning, Asynchronous Learning, Non-IID Data