Communication-efficient Distributed Learning in V2X Networks: Parameter Selection and Quantization
【Author】Barbieri, Luca; Savazzi, Stefano; Nicoli, Monica
【Source】2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022)
【Impact Factor】
【Abstract】In recent years, automotive systems have been integrating Federated Learning (FL) tools to provide enhanced driving functionalities, exploiting sensor data at connected vehicles to cooperatively learn assistance information for safety and maneuvering systems. Conventional FL policies require a central coordinator, namely a Parameter Server (PS), to orchestrate the learning process, which limits the scalability and robustness of the training platform. Consensus-driven FL methods, on the other hand, enable fully decentralized learning implementations where vehicles mutually share the Machine Learning (ML) model parameters, possibly via Vehicle-to-Everything (V2X) networking, at the expense of larger communication resource consumption compared to vanilla FL approaches. This paper proposes a communication-efficient consensus-driven FL design tailored for the training of Deep Neural Networks (DNNs) in vehicular networks. The vehicles taking part in the FL process independently select a pre-determined percentage of model parameters to be quantized and exchanged on each training round. The proposed technique is validated on a cooperative sensing use case where vehicles rely on Lidar point clouds to detect possible road objects/users in their surroundings via a DNN. The validation considers latency, accuracy, and communication-efficiency trade-offs. Experimental results highlight the impact of parameter selection and quantization on the communication overhead in varying settings.
【Keywords】Federated Learning; Connected automated driving; V2X; Artificial Intelligence; Distributed processing
【Publication Year】2022
【Date Added】2023-05-04
【Document Type】Experimental simulation
【Topic Category】
Blockchain Technology - Collaborative Technology - Federated Learning
Comments
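Below is a minimal Python sketch of the core mechanism summarized in the abstract: each vehicle independently selects a pre-determined percentage of its model parameters, quantizes only those, and exchanges them with neighbours on each training round. The top-k magnitude selection rule, the 8-bit uniform quantizer, and the function name `select_and_quantize` are illustrative assumptions; this record does not specify the paper's actual selection criterion or quantizer.

```python
import numpy as np

def select_and_quantize(update, frac=0.10, num_bits=8):
    """Select a fraction of the model-update entries and quantize them.

    update   : flattened local model update of one vehicle (1-D array)
    frac     : fraction of parameters exchanged per round (illustrative value)
    num_bits : resolution of the uniform quantizer (assumed, not from the paper)
    Returns the selected indices and the values a neighbour would reconstruct.
    """
    k = max(1, int(frac * update.size))
    # Assumption: top-k selection by magnitude; random selection is another option.
    idx = np.argpartition(np.abs(update), -k)[-k:]
    chosen = update[idx]

    # Uniform quantization of the selected entries to 2**num_bits levels.
    lo, hi = chosen.min(), chosen.max()
    step = (hi - lo) / (2 ** num_bits - 1) or 1.0   # guard against a constant block
    codes = np.round((chosen - lo) / step)          # integer codes in 0 .. 2**num_bits - 1
    reconstructed = codes * step + lo               # de-quantized values sent onward
    return idx, reconstructed

# Toy round: one vehicle compresses its local update before a consensus exchange.
rng = np.random.default_rng(0)
local_update = rng.normal(size=10_000)
idx, vals = select_and_quantize(local_update, frac=0.10, num_bits=8)
print(f"exchanged {idx.size} of {local_update.size} parameters")
```

In a full consensus round, each vehicle would apply such a compression step to its own update and combine the reconstructed values received from its one-hop neighbours, so only roughly a fraction frac of the parameters, at num_bits rather than 32-bit precision, crosses the V2X link per round.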