【Author】Ramanan, Paritosh; Nakayama, Kiyoshi
【Source】2020 IEEE INTERNATIONAL CONFERENCE ON BLOCKCHAIN (BLOCKCHAIN 2020)
【Abstract】A key aspect of Federated Learning (FL) is the requirement of a centralized aggregator to maintain and update the global model. However, in many cases orchestrating a centralized aggregator might be infeasible due to numerous operational constraints. In this paper, we introduce BAFFLE, an aggregator free, blockchain driven, FL environment that is inherently decentralized. BAFFLE leverages Smart Contracts (SC) to coordinate the round delineation, model aggregation and update tasks in FL. BAFFLE boosts computational performance by decomposing the global parameter space into distinct chunks followed by a score and bid strategy. In order to characterize the performance of BAFFLE, we conduct experiments on a private Ethereum network and use the centralized and aggregator driven methods as our benchmark. We show that BAFFLE significantly reduces the gas costs for FL on the blockchain as compared to a direct adaptation of the aggregator based method. Our results also show that BAFFLE achieves high scalability and computational efficiency while delivering similar accuracy as the benchmark methods.
【Keywords】Blockchain based decentralization; Aggregator Free Federated Learning; Ethereum driven Smart Contracts
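The abstract mentions that BAFFLE decomposes the global parameter space into distinct chunks before applying its score-and-bid strategy. As a minimal illustrative sketch (not the paper's actual implementation; the function name `chunk_params` and the chunk size are hypothetical), splitting a flat parameter vector into contiguous chunks could look like this, with clients then scoring and bidding on individual chunks rather than the whole model:

```python
def chunk_params(params, chunk_size):
    """Split a flat parameter list into contiguous fixed-size chunks.

    Per-chunk updates keep each smart-contract transaction small,
    which is one way to reduce gas costs versus writing the full
    model in a single transaction.
    """
    return [params[i:i + chunk_size] for i in range(0, len(params), chunk_size)]

# Example: 7 parameters split into chunks of size 3 yields 3 chunks,
# the last one shorter than the rest.
global_params = [0.1, -0.2, 0.3, 0.05, 0.7, -0.4, 0.0]
chunks = chunk_params(global_params, 3)
print(len(chunks))
```

This sketch only shows the decomposition step; the paper's coordination of rounds, scoring, and bidding is handled on-chain via Smart Contracts.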
【Title】BAFFLE: Blockchain Based Aggregator Free Federated Learning
【Publication Year】2020
【Date Indexed】2022-07-06
【Document Type】Proceedings Paper
【Primary Topic】Blockchain-based Federated Learning
【Secondary Topic】Federated learning as the main focus
【Translator】石东瑛