I. Introduction
Conservation Voltage Reduction (CVR) is pivotal for optimizing energy consumption and addressing voltage-related challenges in power systems [1]. Assessing the impact of CVR on specific feeders is essential for utilities to pinpoint opportunities for optimal energy savings. Empirical tests have shown a 0.3%-1% reduction in load consumption for every 1% decrease in voltage, a relationship quantified by the CVR factor [2]. Computing this factor, however, is intricate: it depends on power grid configurations, load types, customer behaviors, and local weather, resulting in stochastic and time-variant loading conditions [3]. Furthermore, the growing integration of distributed generators, microgrids, electric vehicles, and demand response strategies makes the grid more active, further complicating CVR factor computation [4]-[7]. Consequently, achieving a precise and robust evaluation of the CVR factor remains a significant challenge in effectively implementing CVR strategies.
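For concreteness, the CVR factor is commonly defined in the literature as the ratio of the percentage change in energy (or power) consumption to the percentage change in voltage; the sketch below states this standard definition, with symbols chosen here for illustration:

```latex
% CVR factor: ratio of relative load reduction to relative voltage reduction
\mathrm{CVR}_f \;=\; \frac{\Delta E / E}{\Delta V / V} \;=\; \frac{\%\,\Delta E}{\%\,\Delta V}
```

Under this definition, the empirical range cited above corresponds to $\mathrm{CVR}_f \in [0.3,\, 1.0]$: for example, a 1% voltage reduction yielding a 0.8% drop in consumption gives $\mathrm{CVR}_f = 0.8$.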