January 2025
IEEE Transactions on Mobile Computing
In the evolution of the Internet of Vehicles (IoV), the increasing demand for vehicular computation tasks presents significant challenges, particularly in the context of constrained local computation resources and high processing delays. To mitigate these challenges, multi-access edge computing (MEC) offers a potential solution by leveraging edge servers for low-latency processing. However, it also encounters issues such as sub-channel competition and workload imbalance owing to the uneven distribution of vehicle densities. This paper introduces a novel IoV architecture that incorporates multi-task and multi-roadside unit (RSU) capabilities, enabling edge-to-edge collaboration for efficient task offloading among RSUs. The optimization problem is formulated with the objective of minimizing the overall task delay, and is further divided into two sub-problems: communication resource allocation and load balancing. Considering the non-deterministic polynomial (NP)-hard nature of these sub-problems, we propose a two-stage deep reinforcement learning-based communication resource allocation and load balancing (DRLCL) algorithm to address them sequentially. Based on realistic vehicle trajectories, comprehensive evaluation results demonstrate the superiority of the proposed algorithm in reducing system delay compared to existing state-of-the-art baselines, offering an effective approach for optimizing the performance of vehicular edge computing (VEC) networks.
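The two-stage decomposition described above can be illustrated with a minimal sketch. Note that this is not the paper's DRLCL algorithm: the function names are hypothetical, and simple greedy heuristics stand in for the learned policies that the paper trains with deep reinforcement learning. The sketch only shows the sequential structure, with stage one allocating sub-channels among competing tasks and stage two rebalancing workload across RSUs.

```python
# Illustrative two-stage decomposition for multi-RSU task offloading.
# Greedy heuristics are placeholders for the DRL policies in the paper;
# all names and numbers here are assumptions for demonstration only.

def allocate_subchannels(demands, n_channels):
    """Stage 1 (communication resource allocation): assign each task a
    number of sub-channels roughly proportional to its data demand."""
    total = sum(demands)
    return [max(1, round(n_channels * d / total)) for d in demands]

def balance_loads(rsu_loads, capacity):
    """Stage 2 (load balancing): migrate work from overloaded RSUs to the
    least-loaded RSU until no RSU exceeds `capacity`, mimicking
    edge-to-edge collaboration in a highly simplified form."""
    loads = list(rsu_loads)
    for i in range(len(loads)):
        while loads[i] > capacity:
            j = min(range(len(loads)), key=lambda k: loads[k])
            if loads[j] >= capacity:
                break  # every RSU is saturated; nothing can move
            move = min(loads[i] - capacity, capacity - loads[j])
            loads[i] -= move
            loads[j] += move
    return loads
```

Solving the sub-problems in this order mirrors the paper's sequential treatment: once sub-channels are fixed, the transmission delay of each task is determined, and load balancing then addresses the remaining computation delay caused by uneven vehicle densities.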