【Author】
Feng, Lei; Zhao, Yiqi; Guo, Shaoyong; Qiu, Xuesong; Li, Wenjing; Yu, Peng
【Source】IEEE TRANSACTIONS ON COMPUTERS
【Abstract】As an emerging distributed machine learning (ML) method, federated learning (FL) protects data privacy through the collaborative learning of artificial intelligence (AI) models across a large number of devices. However, inefficiency and vulnerability to poisoning attacks have hindered FL performance. Therefore, a blockchain-based asynchronous federated learning (BAFL) framework is proposed to provide the security and efficiency that FL requires. The blockchain ensures that model data cannot be tampered with, while asynchronous learning speeds up global aggregation. A novel entropy weight method is used to evaluate each device's participation rank and the proportion its locally trained model contributes to aggregation in BAFL. Energy consumption and local model update efficiency are balanced by adjusting the local training and communication delay and by optimizing the block generation rate. Extensive evaluation results show that the proposed BAFL framework achieves higher efficiency and better resistance to poisoning attacks than other distributed ML methods.
【Keywords】Blockchain; Training; Data models; Servers; Collaborative work; Computational modeling; Load modeling; Federated learning; Security; Asynchronous learning; Learning efficiency
【Comments】
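The entropy weight method named in the abstract is a standard multi-criteria weighting scheme: criteria whose values vary more across devices receive larger weights, and each device's weighted score can then scale how strongly its local model is blended into the global model on arrival. Below is a minimal sketch under stated assumptions; the choice of criteria columns (e.g., local validation accuracy, staleness, data size), the base_rate mixing parameter, and the helper names are illustrative, not the paper's exact formulation.

import numpy as np

def entropy_weights(indicators):
    """Entropy weight method over an (n_devices x n_criteria) matrix.

    Returns one weight per criterion; a criterion with no variation
    across devices ends up with weight zero.
    """
    n, _ = indicators.shape
    col_min = indicators.min(axis=0)
    col_rng = indicators.max(axis=0) - col_min + 1e-12
    norm = (indicators - col_min) / col_rng          # min-max normalize columns
    p = (norm + 1e-12) / (norm + 1e-12).sum(axis=0)  # per-criterion proportions
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(n)  # scaled Shannon entropy
    divergence = 1.0 - entropy
    return divergence / divergence.sum()

def device_scores(indicators):
    """Score each device as the weighted sum of its normalized criteria."""
    w = entropy_weights(indicators)
    col_min = indicators.min(axis=0)
    col_rng = indicators.max(axis=0) - col_min + 1e-12
    norm = (indicators - col_min) / col_rng
    return norm @ w

def async_aggregate(global_w, local_w, score, base_rate=0.5):
    """Asynchronous update: when one device's local model arrives, blend it
    into the global model with a mixing rate scaled by that device's score."""
    alpha = base_rate * score
    return (1.0 - alpha) * global_w + alpha * local_w

# Hypothetical usage: rows are devices, columns are evaluation criteria
# (validation accuracy, staleness in rounds, number of local samples).
scores = device_scores(np.array([[0.91, 1.0, 500.0],
                                 [0.74, 3.0, 300.0],
                                 [0.88, 2.0, 800.0]]))

In this reading, a stale or low-quality local update earns a small score and therefore nudges the global model only slightly, which is one plausible way the framework could tolerate asynchronous arrivals and dampen poisoned contributions; the paper should be consulted for the exact ranking criteria and aggregation rule.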