【Author】
Liu, Lifeng; Hu, Yifan; Yu, Jiawei; Zhang, Fengda; Huang, Gang; Xiao, Jun; Wu, Chao
【Source】ICVISP 2019: PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON VISION, IMAGE AND SIGNAL PROCESSING
【Abstract】Currently, training neural networks often requires a large corpus of data from multiple parties. However, in many cases data owners are reluctant to share their sensitive data with third parties for modelling. Therefore, Federated Learning (FL) has arisen as an alternative that enables collaborative training of models without sharing raw data, by distributing modelling tasks to multiple data owners. Based on FL, we present a novel and decentralized approach to training encrypted models with privacy-preserved data on Blockchain. In our approach, Blockchain is adopted as the machine learning environment in which different actors (i.e., the model provider and the data providers) collaborate on the training task. During the training process, an encryption algorithm is used to protect the privacy of both the data and the trained model. Our experiments demonstrate that our approach is practical in real-world applications.
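The collaboration pattern the abstract describes, in which data providers train locally and only model updates are aggregated, can be sketched as a FedAvg-style round. This is a minimal illustration under assumed names (`local_update`, `federated_round`); the paper's actual protocol additionally runs on Blockchain and encrypts the data and model, which this sketch omits.

```python
def local_update(weights, data, lr=0.1):
    """Each data provider refines the model on its private data.
    Here: gradient steps of least-squares on (x, y) pairs."""
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    """The model provider averages the locally trained weights;
    raw data never leaves the data providers."""
    local_ws = [local_update(global_w, d) for d in client_datasets]
    return sum(local_ws) / len(local_ws)

# Three hypothetical data providers, each holding one private sample of y = 2x.
clients = [[(1.0, 2.0)], [(2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward 2.0
```

The key property illustrated is that only weights cross the trust boundary, never the `(x, y)` samples themselves.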
【Keywords】Blockchain; Machine Learning; Distributed Learning; Privacy