I. Introduction
In recent years, we have witnessed significant growth in the number of smart objects (or “things”) connected to the Internet to interact and cooperate with each other, collectively called the Internet of Things (IoT). This trend poses a significant challenge: sharing computational and communication resources efficiently while guaranteeing stringent timing constraints, primarily to achieve fault isolation and containment, which is also key to designing a reliable cyber-physical system, an emerging system of systems often regarded as real-time IoT. Providing end-to-end timing guarantees requires resource-utilization estimates for applications; however, conservative worst-case execution time (WCET) estimates have conventionally been used for safety-critical applications, leading to severely under-utilized systems in practice. For example, a task typically exhibits variation in its execution time depending on the input data and the behavior of the environment; the exact WCET is usually unknown and can be overestimated [1]. A task can also experience large variation in interference from other tasks depending on the scheduling policy and execution scenario. Although the amount of interference can be estimated tightly in some environments, calculating it accurately is difficult (often computationally intractable) in many complex environments such as distributed systems [2], [3].