I. Introduction
Research on Internet of Things (IoT) applications has recently attracted increasing interest, driven by advances in computation, communication, and sensing technologies [1], [2], [3] and by the proliferation of IoT devices (IDs). To support applications such as traffic surveillance, autonomous driving, and environmental monitoring, real-time information about the physical environment must be sensed and transformed into state updates for subsequent system operation. Meanwhile, the Age of Information (AoI), a metric that quantifies data freshness in IoT applications (its standard form is recalled below), has been extensively studied and has become integral to the design of communication systems [4], [5], [6], [7], [8], [9], [10].

However, the processing of sensory data is often constrained by the computational and temporal limitations inherent in IDs, which typically have restricted computing power, storage capability, and battery capacity. To address these constraints, multi-access edge computing (MEC) has emerged as a promising solution by bringing networking, storage, and computation capabilities closer to the network edge [9], [11], [12], [13]. By leveraging the computational resources of MEC servers, IDs can offload their computational tasks for efficient execution. Nonetheless, the growing number of devices in next-generation IoT systems has led to a surge in interactions, both among devices and between devices and MEC servers [7]. As a result, the joint optimization of sensing and computation strategies becomes highly complex, posing a significant challenge in MEC-assisted IoT systems.
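For reference, the standard definition of AoI from the literature cited above (recalled here rather than introduced by this work) is

\begin{equation}
    \Delta(t) = t - U(t),
\end{equation}

where $U(t)$ denotes the generation time of the most recently received status update at time $t$; a smaller $\Delta(t)$ thus indicates fresher information at the monitor.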