【Author】
Rahmadika, Sandi; Rhee, Kyung-Hyune
【Source】INTERNATIONAL JOURNAL OF AD HOC AND UBIQUITOUS COMPUTING
【Abstract】Federated learning (FL) permits a vast number of connected devices to construct deep learning models while keeping their private training data on the device. Rather than uploading the training data and model to the server, FL incrementally sends only the local gradients; hence, FL preserves data privacy by design. FL takes a decentralised approach in which the training data is no longer concentrated. Blockchain takes the same approach, providing a digital ledger that covers the flaws of centralised systems. Motivated by the merits of decentralisation, we construct a collaborative model for simultaneous distributed learning that employs multiple computing devices over shared memory, with blockchain smart contracts as a secure incentive mechanism. The collaborative model preserves the value-driven nature of distributed learning in enhancing users' privacy, and it is supported by a secure decentralised blockchain incentive technique with no single point of failure. Furthermore, potential vulnerabilities and plausible defences are outlined. The experimental results indicate that the collaborative model satisfies the design goals.
【Keywords】blockchain; decentralised revenue; decentralised training; federated learning; predictive model; user privacy
【Title】Enhancing Data Privacy Through a Decentralised Predictive Model with Blockchain-Based Revenue
【Comments】
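The abstract's core mechanism is that clients share gradient updates rather than raw data, which a server then averages into a global model. Below is a minimal sketch of that idea, assuming a logistic-regression objective and FedAvg-style averaging; the function names (local_update, federated_round) and hyperparameters are illustrative, not taken from the paper, and the blockchain incentive layer is omitted.

import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """One client's local training step (logistic regression via gradient descent).
    Only the resulting weight delta leaves the device; the raw data (X, y) never does."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)      # logistic-loss gradient
        w -= lr * grad
    return w - weights                         # local update (delta) sent to the server

def federated_round(global_weights, clients):
    """Server-side step: average the client deltas and apply them (FedAvg-style)."""
    deltas = [local_update(global_weights, X, y) for X, y in clients]
    return global_weights + np.mean(deltas, axis=0)

# Toy run: three clients, each holding private data that stays local.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.integers(0, 2, 20).astype(float))
           for _ in range(3)]
w = np.zeros(5)
for _ in range(10):
    w = federated_round(w, clients)
print(w)

In the paper's setting, the aggregation step would additionally be coupled to a blockchain smart contract that pays contributors for their updates; the sketch above covers only the privacy-preserving training loop.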