Federated Multi-Task Learning on Non-IID Data Silos: An Experimental Study
CoRR (2024)
Abstract
Federated Multi-Task Learning (FMTL) combines the benefits of Federated Learning
(FL) and Multi-Task Learning (MTL), enabling collaborative model training on
multi-task datasets. However, the field currently lacks a comprehensive
evaluation method that integrates the distinctive features of both FL and MTL.
This paper fills this gap by introducing FMTL-Bench, a novel framework for
systematic evaluation of the FMTL paradigm. The benchmark covers the data,
model, and optimization-algorithm levels and comprises seven sets of comparative
experiments spanning a wide array of non-independent and identically distributed
(Non-IID) data partitioning scenarios. We propose a systematic process for
comparing baselines across diverse indicators and conduct a case study on
communication cost, time, and energy consumption. Through these exhaustive
experiments, we provide insights into the strengths and limitations of existing
baseline methods, contributing to the ongoing discourse on applying FMTL
effectively in practical scenarios. The source code will be made available to
allow replication of our results.
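The abstract highlights Non-IID data partitioning scenarios. As a minimal sketch of one widely used label-skew scheme (Dirichlet partitioning; this is an illustrative assumption, not necessarily the exact scheme used in the paper), sample indices can be split among clients so that a smaller concentration parameter `alpha` yields more heterogeneous label distributions:

```python
import numpy as np

def dirichlet_partition(labels, n_clients, alpha=0.5, seed=0):
    """Split sample indices among clients with label-skew Non-IID.

    A lower alpha yields more skewed (more Non-IID) per-client
    label distributions; a high alpha approaches an IID split.
    (Illustrative helper, not from the paper's code base.)
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        # Shuffle the indices of this class, then carve them up
        # according to a client mixture drawn from Dir(alpha).
        idx = rng.permutation(np.flatnonzero(labels == c))
        proportions = rng.dirichlet(alpha * np.ones(n_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in zip(client_indices, np.split(idx, cuts)):
            client.extend(part.tolist())
    return client_indices

# Example: 1000 samples over 10 classes, split among 4 clients.
labels = np.arange(1000) % 10
parts = dirichlet_partition(labels, n_clients=4, alpha=0.3)
```

Every sample lands on exactly one client, while each client's class histogram is skewed in proportion to its Dirichlet draw.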