【Author】
Kim, Hyesung; Park, Jihong; Bennis, Mehdi; Kim, Seong-Lyun
【Source】IEEE COMMUNICATIONS LETTERS
【Abstract】By leveraging blockchain, this letter proposes a blockchained federated learning (BlockFL) architecture where local learning model updates are exchanged and verified. This enables on-device machine learning without any centralized training data or coordination by utilizing a consensus mechanism in blockchain. Moreover, we analyze an end-to-end latency model of BlockFL and characterize the optimal block generation rate by considering communication, computation, and consensus delays.
【Keywords】Computational modeling; Blockchain; Training; Servers; Delays; Data models; On-device machine learning; federated learning; blockchain; latency
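The abstract's central trade-off — a higher block generation rate shortens the wait for a block but increases forking and consensus overhead — can be illustrated with a toy latency model. The sketch below is only a schematic reading of that trade-off, not the letter's actual latency formula: the fixed delay, the `exp(rate * tau)` fork-retry inflation, and all parameter values are illustrative assumptions.

```python
import math

def expected_latency(rate, tau=0.5, fixed=2.0):
    """Toy end-to-end latency for a block generation rate `rate`:
    a fixed computation/communication delay plus the expected wait 1/rate
    for a block, inflated by exp(rate * tau) to mimic forking retries,
    which become more frequent when blocks arrive faster than they
    propagate (propagation delay tau). Parameters are illustrative only."""
    return (fixed + 1.0 / rate) * math.exp(rate * tau)

# Grid-search for the rate that minimizes the toy latency, echoing the
# letter's idea of an optimal block generation rate.
rates = [0.01 * k for k in range(1, 500)]
best_rate = min(rates, key=expected_latency)
```

Under these assumed parameters the latency is high at both extremes (long block waits at low rates, frequent forks at high rates) and is minimized at an interior rate, which is the qualitative shape the letter's analysis characterizes.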