【Author】
Li, Zhidu; Zhou, Yujie; Wu, Dapeng; Wang, Ruyan
【Source】2021 IEEE INTERNATIONAL CONFERENCE ON BLOCKCHAIN (BLOCKCHAIN 2021)
【Abstract】Federated learning (FL) has been considered a promising distributed learning tool for massive data mining across different local devices. To address the trust risk of centralized model aggregation and the challenge of data heterogeneity in traditional FL, this paper proposes an enhanced FL approach in a blockchain network. By analyzing the shortcomings of the classic FL widely used in blockchain-enabled FL networks, we propose a novel local parameter update approach, in which information from the last-round global model is utilized to reduce the local performance drift caused by data heterogeneity. The convergence of the proposed FL approach is then proven, and the convergence rate is shown to be linear in the training time. Finally, extensive experiments are carried out on a public dataset to validate the effectiveness of the proposed approach in comparison with two classic baseline approaches.
【Keywords】Federated learning; data heterogeneity; local model update; convergence performance
【Title】Local Model Update for Blockchain-Enabled Federated Learning: Approach and Analysis
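The abstract does not give the exact update rule, but the idea it describes (using the last-round global model to limit local drift under heterogeneous data) can be sketched with a FedProx-style proximal term. The sketch below is a minimal scalar illustration under that assumption; `local_update`, the toy quadratic losses, and all parameter values are hypothetical, not the paper's method.

```python
def local_update(w_global, grad_fn, lr=0.1, mu=0.5, steps=20):
    """One client's local training round (hedged sketch).

    Each gradient step adds a proximal pull mu * (w - w_global)
    toward the last-round global model, so local training on
    heterogeneous (non-IID) data drifts less from the consensus
    model than plain local SGD (the mu = 0 case) would.
    """
    w = w_global
    for _ in range(steps):
        # local loss gradient plus proximal pull toward w_global
        w -= lr * (grad_fn(w) + mu * (w - w_global))
    return w


# Toy example: two clients whose quadratic losses have minima at
# 0 and 4 mimic heterogeneous local data; the global model is 2.
w_global = 2.0
client_grads = [lambda w: 2 * (w - 0.0), lambda w: 2 * (w - 4.0)]

local_models = [local_update(w_global, g) for g in client_grads]
# Server-side (or on-chain) aggregation by simple averaging.
w_next = sum(local_models) / len(local_models)
```

With `mu > 0` each client's update stays closer to the global model than with `mu = 0`, which is the drift-reduction effect the abstract attributes to reusing the last-round global model.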