【Author】 Fang, Chen; Guo, Yuanbo; Ma, Jiali; Xie, Haodong; Wang, Yifeng
【Source】COMPUTER COMMUNICATIONS
【Abstract】As a novel distributed learning mechanism, federated learning has drawn widespread attention by allowing multiple parties to collaboratively train an accurate model without collecting their raw data. However, it relies on a trusted central server and still suffers from severe security challenges, such as model inversion attacks and a single point of failure. We therefore propose a privacy-preserving and verifiable federated learning method based on blockchain to enable fully decentralized and reliable federated learning in an untrusted network. In our scheme, we propose a secure aggregation protocol that guarantees the confidentiality of gradients while tolerating clients dropping out during the workflow, and design a novel blockchain structure enabling global gradient verification to defend against potential tampering attacks. In addition, a gradient compression method is proposed to reduce the communication overhead. Security analysis shows that our scheme preserves privacy by adding pairwise random masks to the gradients, and prevents Sybil attacks through a reasonable threshold setting in verifiable secret sharing. Experimental results on two real-world datasets show that when the clients' dropout rate is below 20%, our scheme achieves almost the same accuracy as original federated learning and outperforms similar blockchain-based federated learning methods in terms of computation and communication overhead.
【Keywords】Federated learning; Blockchain; Verifiable secret sharing; Polynomial commitment; Tampering attack
【Title】A privacy-preserving and verifiable federated learning method based on blockchain
【Publication Year】2022
【Date Added】2022-08-08
【Document Type】Article
【Main Topic】Blockchain-based federated learning
【Impact Factor】5.047
【Translator】石东瑛
Comments
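The abstract states that privacy is preserved by adding pairwise random masks to clients' gradients so that the masks cancel once all updates are aggregated. The following is a minimal Python sketch of that masking idea only, not the paper's actual protocol: the real scheme derives pairwise seeds via key agreement and recovers dropped clients' masks with verifiable secret sharing, neither of which is reproduced here. All names (`pairwise_seed`, `mask_gradient`, `DIM`, `NUM_CLIENTS`) are illustrative assumptions.

```python
# Illustrative sketch of pairwise random masking for secure aggregation.
# Each client adds a mask derived from a seed shared with every other client;
# the masks cancel in the sum, so the aggregator never sees raw gradients.

import numpy as np

DIM = 8          # length of the flattened gradient vector (illustrative)
NUM_CLIENTS = 4  # number of participating clients (illustrative)

def pairwise_seed(i: int, j: int) -> int:
    # Stand-in for a key-agreement-derived seed between clients i and j;
    # here it is simply a deterministic value both parties can compute.
    return hash((min(i, j), max(i, j))) & 0xFFFFFFFF

def mask_gradient(i: int, grad: np.ndarray, n: int) -> np.ndarray:
    # Client i adds PRG(seed(i, j)) for j > i and subtracts it for j < i,
    # so every pairwise mask cancels once all clients' updates are summed.
    masked = grad.copy()
    for j in range(n):
        if j == i:
            continue
        prg = np.random.default_rng(pairwise_seed(i, j)).normal(size=grad.shape)
        masked += prg if i < j else -prg
    return masked

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grads = [rng.normal(size=DIM) for _ in range(NUM_CLIENTS)]
    masked = [mask_gradient(i, g, NUM_CLIENTS) for i, g in enumerate(grads)]
    # The aggregator only sees masked updates, yet their sum equals the true sum.
    assert np.allclose(sum(masked), sum(grads))
    print("aggregated gradient recovered:", np.round(sum(masked), 3))
```

In the paper's setting, handling dropouts requires more than this sketch: if a client leaves after sending its masked update, its pairwise masks no longer cancel, which is why the surviving clients use threshold (verifiable) secret sharing to reconstruct the missing seeds.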