I. Introduction and Motivation
Fifth Generation (5G) and Beyond networks aim to achieve peak data rates ranging from tens of Gigabits per second (Gbps) to one Terabit per second (Tbps). High-frequency spectrum bands such as mmWave and Terahertz are widely recognized as the enabling technologies for attaining the Key Performance Indicators (KPIs) of next-generation networks. However, the propagation properties of these channels impose limitations such as high propagation loss and blockage by foliage, rain, and the human body [1]. Recent studies [2], [3] have highlighted that extreme fluctuations in radio link quality yield degraded goodput, high latency, and low resource utilization. In addition, radio-layer events such as PDCP/RLC re-establishment, BSR/SR, and HARQ/ARQ procedures, triggered by mobility, handovers, and frequent link-level errors, impact the performance of the transport layer. The frequency of these disruptions is expected to increase as we transition towards Non-Terrestrial Networks (NTN) [4], as depicted in Fig. 1. Based on our experiments and literature survey, Table I lists the radio-layer procedures that impact the achieved goodput and the End-to-End (E2E) delay.