
A Novel Routing Protocol over Time-Varying IoT Network Using RTS/CTS Framework


Abstract:

In the context of 6G IoT networks, distributed computing has become a critical enabler for seamless communication across diverse sectors, such as smart manufacturing, intelligent transportation, healthcare, and defense. Many of these applications involve mobile IoT devices (IoDs) that change locations dynamically, requiring robust and efficient multi-hop data transmission frameworks. Selecting the optimal relay node in distributed 6G IoT networks is essential, as it directly impacts energy efficiency, latency, data interference, and throughput. Additionally, ensuring the availability of the relay node’s queue is crucial to prevent data loss. To address these challenges, this work introduces a novel distributed computing framework based on the Request-To-Send (RTS) and Clear-To-Send (CTS) protocols, specifically designed for time-varying 6G IoT networks. The framework incorporates an RTS/CTS mechanism to manage distributed data transfer and employs the Optimal Relay IoD Selection with Q-Learning (OptRISQL) algorithm to determine the optimal relay path dynamically. By distributing computation across nodes, the proposed approach enhances network flexibility, reduces queue sizes, and minimizes memory costs, leading to improved overall network efficiency. Simulation results demonstrate that this distributed computing method effectively reduces interference and enhances throughput, outperforming other current state-of-the-art techniques.

Index Terms: 6G Internet of Things (IoT), Distributed Computing, Request-To-Send (RTS), Clear-To-Send (CTS), Quality-of-Service (QoS).
Date of Conference: 15-18 December 2024
Date Added to IEEE Xplore: 03 March 2025
Conference Location: Guwahati, India

I. Introduction

The rapid advancement of 6G IoT networks marks a new era in distributed computing, enabling unprecedented levels of connectivity and communication across various sectors, including smart manufacturing, intelligent transportation, healthcare, and defense [1], [2], [3], [4]. A key technology supporting these advancements is the Low Power Wide Area Network (LPWAN), known for its cost-efficiency and wide coverage. Among LPWAN technologies, the Long Range Wide Area Network (LoRaWAN) has gained significant attention in IoT applications such as smart agriculture, healthcare, and smart metering. LoRaWAN primarily enables one-hop communication between network nodes and gateways, which simplifies deployment but can also strain gateway capacity, since nodes independently transmit packets over direct links. This increases power consumption and can lead to interference from other wireless networks, causing packet loss and reducing overall network throughput. To address these issues, distributed computing in 6G IoT networks allows neighboring nodes to act as relays, adopting a multi-hop approach that reduces congestion, minimizes power use, and improves reliability and performance.

Various studies have focused on reducing energy consumption and network latency for multi-hop data transmission in LoRaWAN. The authors in [5] compiled a comprehensive list of works on multi-hop transmission in LoRaWAN technology. In [6], the authors discussed the impact of IoT on modern healthcare systems, such as fitness assistance, sleep monitoring, and daily diet tracking; the survey studies intelligent health monitoring systems by categorizing their sensor components into device-based and device-free techniques. In 6G IoT multi-hop networks, faulty nodes can degrade network performance and lifetime, so predicting faulty nodes before data transmission can further enhance network lifetime and QoS. The authors in [7] introduced a Q-learning framework that predicts faulty nodes and avoids routing data through them, thereby improving network performance. Reference [8] addresses the problem of optimal relay selection in dynamic IoT networks whose topology is not static. This work uses a methodology named Optimal Relay IoD Selection Using Q-Learning (OptRISQL), which aims to improve network performance by dynamically selecting the best relay in the network for energy efficiency (EE) and QoS.
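As a concrete illustration of the handshake described above, the following minimal Python sketch gates forwarding on the availability of a candidate relay's queue: a node issues an RTS to its candidate relays and forwards the packet to the first relay that answers with a CTS. The names (Relay, QUEUE_CAPACITY, request_to_send) and the fixed buffer size are illustrative assumptions, not details taken from the paper.

from collections import deque
from dataclasses import dataclass, field
from typing import Optional

QUEUE_CAPACITY = 8  # assumed per-relay forwarding buffer size (packets)

@dataclass
class Relay:
    node_id: int
    queue: deque = field(default_factory=deque)

    def clear_to_send(self) -> bool:
        # Reply with CTS only if the forwarding queue can absorb one more packet.
        return len(self.queue) < QUEUE_CAPACITY

    def enqueue(self, packet: bytes) -> None:
        self.queue.append(packet)

def request_to_send(candidates: list, packet: bytes) -> Optional[Relay]:
    # Issue RTS to each candidate relay; forward to the first one that answers CTS.
    for relay in candidates:
        if relay.clear_to_send():   # CTS received: queue has room, so no loss on arrival
            relay.enqueue(packet)
            return relay
    return None                     # no CTS received: defer the transmission and back off

if __name__ == "__main__":
    relays = [Relay(node_id=i) for i in range(3)]
    chosen = request_to_send(relays, b"sensor-reading")
    print("forwarded via relay", chosen.node_id if chosen else None)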
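Similarly, the relay-selection step that OptRISQL performs can be sketched with plain tabular Q-learning, where the state is the node currently holding the packet and the action is the next-hop relay. The hypothetical connectivity, the epsilon-greedy policy, and the stand-in reward combining residual energy, queue backlog, and a per-hop cost are assumptions for illustration only; the actual OptRISQL formulation is given in [8].

import random

NUM_NODES = 5                      # nodes 0..3 are IoDs, node 4 is the gateway (assumed)
GATEWAY = NUM_NODES - 1
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Hypothetical connectivity: each IoD only reaches the next two nodes, so the
# source must learn a multi-hop path to the gateway.
NEIGHBORS = {i: [j for j in (i + 1, i + 2) if j <= GATEWAY] for i in range(GATEWAY)}

# Q[state][action]: value of forwarding from node `state` to relay `action`.
Q = [[0.0] * NUM_NODES for _ in range(NUM_NODES)]

def choose_next_hop(state: int) -> int:
    # Epsilon-greedy selection of the next relay among reachable neighbours.
    options = NEIGHBORS[state]
    if random.random() < EPSILON:
        return random.choice(options)
    return max(options, key=lambda a: Q[state][a])

def hop_reward(action: int) -> float:
    # Stand-in reward: reaching the gateway is good; otherwise prefer
    # energy-rich, lightly loaded relays (both drawn randomly here).
    if action == GATEWAY:
        return 10.0
    residual_energy, queue_backlog = random.random(), random.random()
    return residual_energy - queue_backlog - 1.0   # per-hop cost

for episode in range(500):
    state = 0                                      # packet starts at the source IoD
    while state != GATEWAY:
        action = choose_next_hop(state)
        best_next = 0.0 if action == GATEWAY else max(Q[action][a] for a in NEIGHBORS[action])
        Q[state][action] += ALPHA * (hop_reward(action) + GAMMA * best_next - Q[state][action])
        state = action

print("learned next hop from the source:", max(NEIGHBORS[0], key=lambda a: Q[0][a]))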
