VFedCS: Optimizing Client Selection for Volatile Federated Learning

IEEE Internet of Things Journal (2022)

Abstract
Federated learning (FL) has shown great potential as a privacy-preserving solution for training a centralized model on local data from available clients. However, we argue that, over the course of training, the available clients may exhibit volatility in terms of the client population, client data, and training status. Considering these volatilities, we propose a new learning scenario termed volatile federated learning (volatile FL), featuring set volatility, statistical volatility, and training volatility. The volatile client set, together with the dynamics of clients' data and the unreliable nature of clients (e.g., unintentional shutdowns and network instability), greatly increases the difficulty of client selection. In this article, we formulate the global problem and decompose it into two subproblems based on alternating minimization. To solve the proposed selection problem efficiently, we quantify the impact of clients' data and resource heterogeneity in volatile FL and introduce the cumulative effective participation data (CEPD) as an optimization objective. Based on this, we propose upper confidence bound-based greedy selection, dubbed UCB-GS, to address the client selection problem in volatile FL. Theoretically, we prove that the regret of UCB-GS is strictly bounded by a finite constant, justifying its theoretical feasibility. Furthermore, experimental results show that our method significantly reduces the number of training rounds (by up to 62%) while increasing the global model's accuracy by 7.51%.
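The abstract does not spell out the algorithm, but the pairing of upper confidence bound estimation with greedy selection suggests a structure along the lines of the sketch below: each client's probability of completing a round is estimated with a UCB index, and clients are greedily ranked by their estimated contribution to the cumulative effective participation data. All names (e.g., ucb_greedy_select, data_size, pulls) are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of UCB-based greedy client selection for volatile FL.
# Assumed structure: each client is a dict with a data size, bandit statistics,
# and (for simulation only) a hidden reliability; nothing here is the authors' API.

import math
import random


def ucb_greedy_select(clients, round_idx, budget, alpha=2.0):
    """Greedily pick up to `budget` clients ranked by UCB-estimated
    effective participation data (data size x estimated completion rate)."""
    scored = []
    for c in clients:
        if c["pulls"] == 0:
            # Unexplored clients get an infinite index so they are tried first.
            ucb = float("inf")
        else:
            mean = c["successes"] / c["pulls"]  # empirical completion rate
            bonus = math.sqrt(alpha * math.log(round_idx + 1) / c["pulls"])
            ucb = min(mean + bonus, 1.0)
        # Estimated effective data the client contributes if it finishes the round.
        scored.append((ucb * c["data_size"], c))
    scored.sort(key=lambda x: x[0], reverse=True)  # greedy: largest estimated gain first
    return [c for _, c in scored[:budget]]


def update_stats(selected, completed_ids):
    """Update bandit statistics after observing which clients finished the round."""
    for c in selected:
        c["pulls"] += 1
        if c["cid"] in completed_ids:
            c["successes"] += 1


if __name__ == "__main__":
    # Toy volatile-client pool: heterogeneous data sizes and hidden reliabilities.
    clients = [{"cid": i,
                "data_size": random.randint(100, 1000),
                "pulls": 0,
                "successes": 0,
                "_reliability": random.uniform(0.4, 0.95)} for i in range(20)]
    for t in range(50):
        chosen = ucb_greedy_select(clients, t, budget=5)
        # Simulate volatility: a chosen client completes with its hidden reliability.
        done = {c["cid"] for c in chosen if random.random() < c["_reliability"]}
        update_stats(chosen, done)
```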
Keywords
Client selection,combinatorial multiarm bandit (CMAB),submodular function,volatile client,volatile federated learning (volatile FL)