
Interless: Interference-Aware Deep Resource Prediction for Serverless Computing


Abstract:

Serverless is an emerging cloud computing paradigm that allows functions to share resources. However, resource sharing among functions introduces interference, which degrades performance. Existing resource prediction approaches ignore function instance placement and inter-function interference, so they cannot predict resource demand at a fine granularity. This paper proposes Interless, an interference-aware resource prediction system for serverless computing built on a sequence-to-sequence neural network. Interless's encoder learns function instance interference directly through a TPA-LSTM module, which also captures historical request queuing for better prediction. Interless's decoder contains a GRU module for long-horizon time-series prediction, which is essential for reserving time in function scheduling and warm-up. Moreover, long-horizon prediction helps Interless identify system anomalies and cyber threats by comparing monitored and predicted resource consumption. We implement Interless on top of Docker Swarm as a serverless system with resource prediction. Experimental results demonstrate that Interless reduces the MAPE, RSE, and SMAPE of prediction by 64%, 58%, and 65%, respectively, compared to the state of the art.
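
As a rough illustration of the architecture the abstract describes (an attention-augmented LSTM encoder feeding a GRU decoder for multi-step forecasts), here is a minimal PyTorch sketch. This is not the authors' implementation: the names InterlessNet, hidden_size, and horizon are illustrative placeholders, and the TPA-LSTM is approximated by a plain LSTM with temporal attention over its hidden states.

```python
# A minimal sketch, NOT the paper's code: a seq2seq resource predictor
# with an attention-augmented LSTM encoder (a stand-in for TPA-LSTM)
# and an autoregressive GRU decoder for multi-step forecasts.
import torch
import torch.nn as nn


class InterlessNet(nn.Module):  # hypothetical name
    def __init__(self, n_features: int, hidden_size: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Temporal attention over encoder states (TPA-LSTM stand-in).
        self.attn = nn.Linear(hidden_size, hidden_size)
        self.decoder = nn.GRUCell(n_features, hidden_size)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, n_features) -- resource history of co-located
        # function instances, so cross-feature interference patterns are
        # visible to the encoder.
        enc_states, (h_n, _) = self.encoder(x)                 # (B, T, H)
        query = h_n[-1]                                        # (B, H)
        scores = torch.bmm(self.attn(enc_states),
                           query.unsqueeze(2)).squeeze(2)      # (B, T)
        weights = torch.softmax(scores, dim=1)
        context = (weights.unsqueeze(2) * enc_states).sum(dim=1)  # (B, H)

        # Roll the forecast out `horizon` steps autoregressively.
        h, step_in, preds = context, x[:, -1, :], []
        for _ in range(self.horizon):
            h = self.decoder(step_in, h)
            step_in = self.out(h)
            preds.append(step_in)
        return torch.stack(preds, dim=1)  # (B, horizon, n_features)


if __name__ == "__main__":
    model = InterlessNet(n_features=4)
    history = torch.randn(8, 48, 4)   # 8 servers, 48 past time steps
    forecast = model(history)
    print(forecast.shape)             # torch.Size([8, 12, 4])
```

A real training loop would fit such a model on monitored per-instance CPU/memory traces and evaluate it with MAPE, RSE, and SMAPE, the metrics reported in the abstract.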
Date of Conference: 25-27 May 2024
Date Added to IEEE Xplore: 17 July 2024
Conference Location: Xi'an, China

I. Introduction

Serverless is an emerging cloud paradigm that enables developers to run code in the cloud without managing cloud resources [1], [2]. Serverless provides a pay-as-you-go pricing model, ensuring that cloud users are charged solely for the actual execution time and resources consumed by their functions, rather than paying for idle capacity. However, cloud providers must estimate the resource demands of serverless functions and schedule them to support dynamic request loads. Because resource configuration and function scheduling incur delays, resource prediction is necessary to reserve time for function updates [3]. Resource prediction also aids anomaly detection by comparing monitored and predicted resource consumption. Multiple function instances are deployed on the same server [4] to improve hardware resource utilization. However, function instances deployed on the same physical server compete for shared resources (e.g., memory bandwidth and CPU caches) [5]-[8], which leads to interference. According to our experiments in § II, interference can decrease performance (i.e., requests per second) by more than 58%. Interference adds complexity to resource prediction.
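
The introduction's anomaly-detection idea is to flag time steps where monitored resource usage strays too far from the model's forecast. The paper does not give the detection rule here, so the sketch below shows one plausible realization: a simple relative-error threshold, whose value is an illustrative choice rather than a figure from the paper.

```python
# A minimal sketch of prediction-based anomaly detection, assuming a
# relative-error threshold; the 0.3 cutoff is illustrative, not the
# paper's value.
import numpy as np


def detect_anomalies(monitored: np.ndarray,
                     predicted: np.ndarray,
                     rel_threshold: float = 0.3) -> np.ndarray:
    """Return a boolean mask of steps whose relative prediction error
    exceeds `rel_threshold` (e.g., a burst caused by a cyber threat or
    a misbehaving co-located function)."""
    rel_error = np.abs(monitored - predicted) / np.maximum(np.abs(predicted), 1e-8)
    return rel_error > rel_threshold


if __name__ == "__main__":
    predicted = np.array([100.0, 110.0, 120.0, 115.0])  # forecast, MB/s
    monitored = np.array([102.0, 108.0, 190.0, 117.0])  # observed, MB/s
    print(detect_anomalies(monitored, predicted))       # [False False  True False]
```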

