I. Introduction
Interconnect parasitics are the dominant source of on-chip circuit delay in modern VLSI technologies. To efficiently account for wire delays in the design process, interconnect modeling, particularly model order reduction of passive linear networks, has been an active research topic in the computer-aided design (CAD) community for more than a decade (e.g., [1]–[4]). As modern VLSI technologies approach the nanoscale manufacturing regime, it is becoming increasingly difficult to control the systematic and random fluctuations introduced in the fabrication process, leading to growing variations in the critical dimensions and material properties of metal and dielectric layers [5], [6]. These process variations inevitably introduce performance variations in interconnects, which must be fully accounted for during timing verification. Accordingly, a significant body of work has emerged to address interconnect variability, either through variational interconnect model order reduction [7]–[13] or through statistical/variational interconnect analysis [11], [14], [15].