I. Introduction
With the rapid development of vehicular networks, vehicular applications generate a large number of data-hungry and delay-sensitive tasks, such as traffic flow prediction and automatic driving. Mobile edge computing (MEC) has become a promising technology in the Internet of Vehicles (IoV) [1], [2], as it allows vehicles to offload their tasks to MEC servers hosted at base stations or roadside units. Accordingly, the great potential of MEC has drawn considerable attention [3]. In this field of research, energy consumption is a crucial concern, because offloading incurs transmission energy consumption in addition to computation. In [3], the authors propose a reinforcement learning (RL) based scheme to achieve low communication overhead in the MEC system.