Federated Domain Adaptation via Transformer for Multi-Site Alzheimer's Disease Diagnosis

IEEE TRANSACTIONS ON MEDICAL IMAGING (2023)

Abstract
In multi-site studies of Alzheimer's disease (AD), data heterogeneity across sites degrades model performance on target sites. Traditional domain adaptation methods require sharing data from both the source and target domains, which raises data privacy issues. To address this, federated learning is adopted, as it allows models to be trained on multi-site data in a privacy-preserving manner. In this paper, we propose a multi-site federated domain adaptation framework via Transformer (FedDAvT), which not only protects data privacy but also alleviates data heterogeneity. A Transformer network is used as the backbone to extract correlations among multi-template region-of-interest (ROI) features, capturing rich brain information. The self-attention maps of the source and target domains are aligned with a mean squared error loss for subdomain adaptation. Finally, we evaluate our method on multi-site databases built from three AD datasets. The experimental results show that the proposed FedDAvT is effective, achieving accuracies of 88.75%, 69.51%, and 69.88% on the AD vs. NC, MCI vs. NC, and AD vs. MCI two-way classification tasks, respectively.
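To make the alignment idea concrete, below is a minimal PyTorch sketch of a Transformer encoder over ROI-feature tokens whose self-attention maps from source and target batches are matched with a mean squared error loss. All names and choices here (roi_dim, embed_dim, num_heads, and the per-class averaging used to approximate "subdomain" alignment) are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of MSE-based alignment of self-attention maps, as described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ROITransformerEncoder(nn.Module):
    """Transformer block over ROI feature tokens that exposes its attention map."""

    def __init__(self, roi_dim: int = 90, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(1, embed_dim)            # each ROI value becomes one token
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.cls_head = nn.Linear(roi_dim * embed_dim, 2)

    def forward(self, x):                               # x: (batch, roi_dim)
        tokens = self.embed(x.unsqueeze(-1))            # (batch, roi_dim, embed_dim)
        out, attn_map = self.attn(tokens, tokens, tokens,
                                  need_weights=True, average_attn_weights=True)
        logits = self.cls_head(out.flatten(1))
        return logits, attn_map                         # attn_map: (batch, roi_dim, roi_dim)


def subdomain_attention_mse(src_attn, src_y, tgt_attn, tgt_pseudo_y, num_classes=2):
    """MSE between class-wise mean attention maps of source and target batches.

    Target labels are assumed to be pseudo-labels, since target data are unlabeled.
    """
    loss = src_attn.new_zeros(())
    for c in range(num_classes):
        s_mask, t_mask = src_y == c, tgt_pseudo_y == c
        if s_mask.any() and t_mask.any():
            loss = loss + F.mse_loss(src_attn[s_mask].mean(0),
                                     tgt_attn[t_mask].mean(0))
    return loss
```

In a federated setting, each site would train such a model locally and exchange only model parameters (or attention statistics) with the server, never raw MRI-derived ROI features, which is what keeps the adaptation privacy-preserving.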
Keywords
Transformers, Federated learning, Adaptation models, Feature extraction, Magnetic resonance imaging, Data privacy, Hospitals, Alzheimer's disease, domain adaptation, federated learning, transformer