A blockchain-based audit approach for encrypted data in federated learning

Digital Communications and Networks (2022)

Abstract
The development of data-driven artificial intelligence technology has given rise to a variety of big data applications, and data has become an essential factor in improving these applications. Federated learning, a privacy-preserving machine learning method, is proposed to leverage data from different data owners. It is typically used in conjunction with cryptographic methods, in which data owners train the global model by sharing encrypted model updates. However, data encryption makes it difficult to assess the quality of these model updates, and malicious data owners may launch attacks such as data poisoning and free-riding. To defend against such attacks, an approach is needed to audit encrypted model updates. In this paper, we propose a blockchain-based audit approach for encrypted gradients. It uses a behavior chain to record the encrypted gradients from data owners, and an audit chain to evaluate the quality of those gradients. Specifically, we propose a privacy-preserving homomorphic noise mechanism in which the noise added to each gradient sums to zero after aggregation, ensuring the availability of the aggregated gradient. In addition, we design a joint audit algorithm that can locate malicious data owners without decrypting individual gradients. Through security analysis and experimental evaluation, we demonstrate that our approach can defend against malicious gradient attacks in federated learning.
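The abstract describes a noise mechanism whose per-owner noise cancels out after aggregation. The sketch below illustrates only that zero-sum cancellation idea in plain Python/NumPy; the owner count, the make_zero_sum_noise helper, and the use of unencrypted arrays are illustrative assumptions, and the paper's homomorphic encryption, blockchain recording, and audit components are omitted.

    # Minimal sketch of the zero-sum noise idea (assumptions noted above;
    # this is not the paper's construction).
    import numpy as np

    def make_zero_sum_noise(num_owners, dim, scale=1.0, rng=None):
        """Generate one noise vector per owner such that they sum to zero."""
        rng = rng if rng is not None else np.random.default_rng()
        noise = rng.normal(0.0, scale, size=(num_owners, dim))
        noise -= noise.mean(axis=0)   # subtract column means so each column sums to 0
        return noise

    rng = np.random.default_rng(42)
    num_owners, dim = 5, 8
    gradients = rng.normal(size=(num_owners, dim))        # each owner's local gradient
    noise = make_zero_sum_noise(num_owners, dim, rng=rng)

    masked = gradients + noise                            # what each owner would share
    aggregated = masked.sum(axis=0)                       # aggregation over all owners

    # The noise cancels, so the aggregate equals the sum of the true gradients,
    # while any single masked gradient reveals only gradient + noise.
    assert np.allclose(aggregated, gradients.sum(axis=0))

In a real deployment each owner would hold only its own noise share (e.g., derived from pairwise secrets), so no single party sees the full noise matrix; the centralized generation here is purely for illustration.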
Key words
Audit, Data quality, Blockchain, Secure aggregation, Federated learning