PoFEL: Energy-efficient Consensus for Blockchain-based Hierarchical Federated Learning

Shengyang Li, Qin Hu, Zhilin Wang

CoRR (2023)

Abstract
Facilitated by mobile edge computing, client-edge-cloud hierarchical federated learning (HFL) enables communication-efficient model training over a wide area, but it also introduces additional security and privacy challenges arising from intermediate model aggregations and still suffers from the single-point-of-failure issue. To tackle these challenges, we propose a blockchain-based HFL (BHFL) system that runs a permissioned blockchain among edge servers for model aggregation, removing the need for a centralized cloud server. The use of blockchain, however, introduces extra overhead. To keep the workflow compact and efficient, we design a novel lightweight consensus algorithm, named Proof of Federated Edge Learning (PoFEL), which recycles the energy consumed for local model training: the leader node is selected by evaluating the intermediate FEL models submitted by all edge servers, rather than through energy-wasting but otherwise meaningless computations. This design improves system efficiency compared with traditional BHFL frameworks. To prevent model plagiarism and bribery voting during the consensus process, we propose Hash-based Commitment and Digital Signature (HCDS) and Bayesian Truth Serum-based Voting (BTSV) schemes. Finally, we devise an incentive mechanism to motivate clients to contribute continuously to the learning task. Experimental results demonstrate that our proposed BHFL system, together with the corresponding consensus protocol and incentive mechanism, achieves effectiveness, low computational cost, and fairness.
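The core idea of PoFEL described above is that leader election is driven by useful work: each edge server submits its intermediate FEL model, and the best-performing model determines the leader. Below is a minimal Python sketch of that selection step, assuming a shared validation set and a plain accuracy metric; all names and the scoring details are hypothetical illustrations, not the paper's exact procedure (which additionally involves voting via BTSV).

```python
# Illustrative sketch of PoFEL-style leader selection; names are hypothetical.
def evaluate(model, validation_set):
    """Score an intermediate FEL model by its accuracy on a shared validation set."""
    correct = sum(1 for x, y in validation_set if model(x) == y)
    return correct / len(validation_set)

def select_leader(candidate_models, validation_set):
    """Select the edge server whose intermediate model performs best.

    candidate_models: dict mapping edge-server id -> callable model.
    The useful work (training and evaluating FEL models) replaces the
    meaningless hash-puzzle computation of proof-of-work-style consensus.
    """
    scores = {sid: evaluate(m, validation_set) for sid, m in candidate_models.items()}
    leader = max(scores, key=scores.get)
    return leader, scores

# Toy usage: two edge servers with trivial "models" over a small validation set.
validation = [(0, 0), (1, 1), (2, 0), (3, 1)]
models = {"edge-A": lambda x: x % 2, "edge-B": lambda x: 0}
print(select_leader(models, validation))  # edge-A wins with higher accuracy
```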
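The abstract also mentions HCDS for preventing model plagiarism during consensus. A standard way to combine a hash-based commitment with a digital signature is a commit-then-reveal flow; the sketch below shows that generic pattern using SHA-256 and Ed25519 (via the third-party cryptography package). The message formats and function names are assumptions for illustration, not the paper's exact construction.

```python
# Generic commit-then-reveal sketch in the spirit of HCDS (illustrative only).
import hashlib, os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def commit(model_bytes: bytes, priv: Ed25519PrivateKey):
    """Phase 1: publish a signed hash commitment instead of the model itself."""
    nonce = os.urandom(16)                        # hides the model until reveal
    digest = hashlib.sha256(nonce + model_bytes).digest()
    signature = priv.sign(digest)                 # binds the commitment to this edge server
    return digest, signature, nonce               # nonce stays secret until the reveal phase

def verify_reveal(model_bytes, nonce, digest, signature, pub):
    """Phase 2: check the revealed model against the earlier signed commitment."""
    if hashlib.sha256(nonce + model_bytes).digest() != digest:
        return False                              # revealed model differs from the committed one
    pub.verify(signature, digest)                 # raises InvalidSignature if forged
    return True

# Toy usage
priv = Ed25519PrivateKey.generate()
model = b"serialized intermediate FEL model"
d, s, n = commit(model, priv)
print(verify_reveal(model, n, d, s, priv.public_key()))  # True
```

Because every server must commit to a signed hash before any model is revealed, a server cannot copy another's revealed model and claim it as its own without failing the commitment check.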
Keywords
consensus, energy-efficient, blockchain-based