
A Cost-Aware Operator Migration Approach for Distributed Stream Processing System


Abstract:

Stream processing is integral to edge computing because of its low-latency characteristics. However, variability in user group sizes and the disparate computing capabilities of edge devices necessitate frequent operator migrations within a stream. Moreover, intricate dependencies among stream operators often hide potential bottleneck operators, which surface only after an already identified bottleneck has been migrated. To address this, we propose a Cost-Aware Operator Migration (CAOM) scheme. CAOM incorporates a bottleneck detection mechanism that directly identifies all bottleneck operators from task running metrics, avoiding multiple consecutive operator migrations in complex tasks and thereby reducing the number of task interruptions caused by migration. CAOM also accounts for the temporal variation in operator migration cost: by factoring in the data generation rate of the sources, which fluctuates across time intervals, it selects the start time for operator migration that minimizes the data accumulated during the task interruption. Finally, we implemented CAOM on Apache Flink and evaluated its performance with the WordCount and Nexmark applications. Our experiments show that CAOM reduces the number of operator migrations required in tasks with complex topologies and decreases the latency overhead of operator migration compared with state-of-the-art schemes.
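As a rough illustration of the cost model described above (this is not the paper's implementation; the forecast input, the method name cheapestStartSecond, and the fixed migration duration are all hypothetical), the sketch below slides a window of the estimated interruption length over a per-second forecast of the source rate and picks the start second that accumulates the fewest records while the operator is paused.

```java
import java.util.List;

/**
 * Illustrative sketch: given a forecast of the source's data generation rate
 * and an estimated migration (interruption) duration, choose the start second
 * that minimizes the data accumulated while the operator is paused.
 */
public class MigrationStartPicker {

    /**
     * @param ratePerSecond       forecast of records generated per second, indexed by second
     * @param migrationDurationSec estimated length of the task interruption, in seconds
     * @return the start second whose window accumulates the fewest records
     */
    public static int cheapestStartSecond(List<Long> ratePerSecond, int migrationDurationSec) {
        int bestStart = 0;
        long bestCost = Long.MAX_VALUE;
        // Slide a window of length migrationDurationSec over the forecast and
        // sum the records that would pile up during that interruption.
        for (int start = 0; start + migrationDurationSec <= ratePerSecond.size(); start++) {
            long accumulated = 0;
            for (int t = start; t < start + migrationDurationSec; t++) {
                accumulated += ratePerSecond.get(t);
            }
            if (accumulated < bestCost) {
                bestCost = accumulated;
                bestStart = start;
            }
        }
        return bestStart;
    }

    public static void main(String[] args) {
        // Hypothetical forecast: the rate dips between seconds 3 and 5,
        // so the cheapest migration window starts there.
        List<Long> forecast = List.of(900L, 850L, 800L, 200L, 150L, 180L, 700L, 950L);
        int start = cheapestStartSecond(forecast, 3);
        System.out.println("Cheapest migration start: second " + start);
    }
}
```

The same idea generalizes to whatever rate forecast and interruption estimate the scheduler actually has; the point is simply that the accumulated backlog, not the migration alone, determines the latency penalty.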
Published in: IEEE Transactions on Cloud Computing ( Volume: 13, Issue: 1, Jan.-March 2025)
Page(s): 441 - 454
Date of Publication: 04 February 2025


I. Introduction

The rise of IoT and the proliferation of edge devices have led to the emergence of edge computing. Because computation runs closer to end users, edge computing significantly reduces latency; it can also leverage cloud-edge collaborative computing to deliver substantial processing power. As a result, edge computing is increasingly used for stream processing applications that require low latency, such as real-time log analysis [1], [2], [3], risk detection [4], [5], and autonomous driving [6], [7], [8].
