【Author】
Zhao, Jian; Wu, Xin; Zhang, Yan; Wu, Yu; Wang, Zhi
【Source】ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT II
【Abstract】Based on the idea that participating organizations exchange only partial gradients rather than their proprietary datasets, federated learning has become a promising approach for organizations to train deep learning models collaboratively. However, conventional federated learning based on a centralized parameter server is susceptible to recovery attacks, in which the original data can be recovered if the attacker collects enough gradients from the organizations. To address this problem, we first propose a blockchain-based decentralized model training architecture for federated learning, which is more robust than the centralized architecture. Based on this architecture, we develop a joint efficiency- and randomness-aware gradient aggregation approach. Our real-world experiments show that our design is not affected by a single point of failure. Moreover, it can increase the model accuracy of the participating organizations, while mitigating the risk of data privacy disclosure and improving gradient aggregation performance.
【Keywords】Federated learning; Blockchain; Smart contract
Comments
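To keep the core idea of the abstract concrete, the sketch below illustrates the generic mechanism federated learning relies on: each organization computes a gradient on its own private data and only the gradients are exchanged and averaged, never the raw datasets. This is a minimal, illustrative example and not the paper's algorithm; the paper replaces the plain averaging step with a blockchain-mediated, efficiency- and randomness-aware aggregation scheme whose details are not given in the abstract. All names (Peer, local_gradient, aggregate) are assumptions made for illustration.

```python
# Minimal sketch (not the paper's method): peers exchange gradients, not data.
import numpy as np

class Peer:
    """One participating organization holding a private dataset."""
    def __init__(self, data, labels, dim):
        self.data, self.labels = data, labels
        self.weights = np.zeros(dim)  # shared model parameters

    def local_gradient(self):
        # Gradient of mean squared error computed on this peer's data only.
        preds = self.data @ self.weights
        return self.data.T @ (preds - self.labels) / len(self.labels)

    def apply(self, grad, lr=0.1):
        self.weights -= lr * grad

def aggregate(grads):
    # Plain averaging; the paper's contribution is a decentralized,
    # blockchain-based replacement for this central aggregation step.
    return np.mean(grads, axis=0)

# Toy run: three peers jointly fit the same linear target without sharing data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
peers = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    peers.append(Peer(X, X @ true_w, dim=2))

for _ in range(200):
    g = aggregate([p.local_gradient() for p in peers])
    for p in peers:
        p.apply(g)

print(peers[0].weights)  # approaches [2, -1]
```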