【Author】 Cheng, Runze; Sun, Yao; Liu, Yijing; Xia, Le; Sun, Sanshan; Imran, Muhammad Ali
【Source】2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)
【Abstract】Cache-enabled device-to-device (D2D) communication has been widely deemed a promising approach to tackling the unprecedented growth of wireless traffic demands. Recently, tremendous efforts have been put into designing efficient caching policies that provide users with a better quality of service. However, public concerns over data privacy remain in D2D cache-sharing networks, which gives rise to an urgent need for a privacy-preserved caching scheme. In this study, we propose a double-layer blockchain-based federated learning (DBFL) scheme that aims to minimize the download latency for all users in a privacy-preserving manner. Specifically, in the sublayer, the devices within the same coverage area run federated learning (FL) to train a caching model for each area separately, without exchanging local data. The model parameters of each area are recorded in sublayer chains using the Raft consensus mechanism. Meanwhile, in the main layer, a mainchain based on the practical Byzantine fault tolerance (PBFT) mechanism is used to resist faults and attacks, thereby securing the reliability of the FL updates. Only the reliable area models authorized by the mainchain are used to update the global model in the main layer. Numerical results show the convergence of the proposed DBFL caching scheme, as well as its gain in download latency compared with several traditional schemes.
【Keywords】D2D Caching; Federated Learning; Blockchain
【Title】A Privacy-Preserved D2D Caching Scheme Supported by Blockchain-Enabled Federated Learning
【Publication Year】2021
【Date Indexed】2022-07-06
【Document Type】Proceedings Paper
【Primary Topic】Blockchain-based Federated Learning
【Secondary Topic】Federated learning as the main focus
【Translator】石东瑛
Comments
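The abstract describes a two-layer aggregation flow: per-area FL rounds whose results are logged to Raft-based sublayer chains, and a PBFT-based mainchain that decides which area models are reliable enough to update the global caching model. The Python sketch below is a minimal illustration of that flow only, not the authors' implementation: the BlockLog class, the fedavg and sublayer_round helpers, the mainchain_gate check, and the max_deviation threshold are all illustrative assumptions, and the Raft/PBFT consensus steps are abstracted to a simple append-only log plus a toy outlier test.

```python
# Minimal sketch (illustrative assumptions only) of the double-layer aggregation
# described in the abstract: sublayer FL per coverage area, sublayer-chain
# logging, and a mainchain gate that authorizes area models before the global
# caching model is updated.

from dataclasses import dataclass, field
from typing import Dict, List

import numpy as np


@dataclass
class BlockLog:
    """Stand-in for a blockchain ledger: an append-only list of records."""
    entries: List[dict] = field(default_factory=list)

    def append(self, record: dict) -> None:
        self.entries.append(record)


def fedavg(local_models: List[np.ndarray], weights: List[int]) -> np.ndarray:
    """Weighted federated averaging of local parameter vectors."""
    total = sum(weights)
    return sum(w / total * m for w, m in zip(weights, local_models))


def sublayer_round(area_id: str,
                   local_models: List[np.ndarray],
                   sample_counts: List[int],
                   subchain: BlockLog) -> np.ndarray:
    """One FL round inside a coverage area: only parameters, not local data,
    are shared. The Raft-based subchain is modeled as a plain log here."""
    area_model = fedavg(local_models, sample_counts)
    subchain.append({"area": area_id, "model": area_model})
    return area_model


def mainchain_gate(area_models: Dict[str, np.ndarray],
                   global_model: np.ndarray,
                   mainchain: BlockLog,
                   max_deviation: float = 5.0) -> Dict[str, np.ndarray]:
    """Toy stand-in for PBFT validation: an area update is 'authorized' only if
    it does not deviate wildly from the current global model (assumed test)."""
    authorized = {}
    for area_id, model in area_models.items():
        ok = np.linalg.norm(model - global_model) <= max_deviation
        mainchain.append({"area": area_id, "authorized": bool(ok)})
        if ok:
            authorized[area_id] = model
    return authorized


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8
    global_model = np.zeros(dim)
    mainchain = BlockLog()
    subchains = {a: BlockLog() for a in ("A", "B", "C")}

    # Sublayer: each area aggregates its devices' local models.
    area_models = {}
    for area_id, subchain in subchains.items():
        locals_ = [global_model + 0.1 * rng.standard_normal(dim) for _ in range(4)]
        counts = [int(n) for n in rng.integers(50, 200, size=4)]
        area_models[area_id] = sublayer_round(area_id, locals_, counts, subchain)

    # Simulate one faulty/attacked area update that the mainchain should reject.
    area_models["C"] = area_models["C"] + 100.0

    # Main layer: only authorized area models update the global caching model.
    trusted = mainchain_gate(area_models, global_model, mainchain)
    global_model = fedavg(list(trusted.values()), [1] * len(trusted))
    print("authorized areas:", sorted(trusted))
```

Running the script aggregates three areas' models, rejects the deliberately corrupted area "C" update at the mainchain gate, and averages the remaining authorized area models into the global model; equal weights are used in the final averaging purely for simplicity.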