I. Introduction
According to the International Technology Roadmap for Semiconductors (ITRS), gate oxide thicknesses of 1.2–1.5 nm will be required by 2004 for sub-100-nm CMOS [1]. In this thin gate oxide regime, the direct tunneling current increases exponentially with decreasing oxide thickness [2], which is a primary concern for CMOS scaling. In conventional CMOS devices, the dominant leakage mechanism is the short-channel effect arising from drain-induced barrier lowering (DIBL). In the ultrathin gate oxide regime, however, the gate leakage current can also contribute significantly to the off-state leakage, which may result in faulty circuit operation, since designers often assume that there is no appreciable gate current.

[Figure: Illustration of the gate direct tunneling components of a very short-channel NMOSFET; the gate-overlap components are the EDT currents.]

A recent study has shown that the direct tunneling current flowing between the source–drain extension (SDE) and the gate overlap, the so-called edge direct tunneling (EDT), dominates the off-state leakage current, especially in very short-channel devices [3], [4]. This results from the fact that the ratio of the gate overlap to the total channel length is larger in a short-channel device than in a long-channel one. The gate current effect is therefore expected to become appreciable in ultrathin-oxide, sub-100-nm MOS circuits. Although many researchers have discussed the effects of gate leakage current, the scaling limitations imposed by gate tunneling current have not been critically addressed from the viewpoint of circuit operation. Assessing circuit immunity to gate tunneling current, for various device structures and bias conditions, is of great importance in determining directions for future gate oxide scaling. This article considers circuit operation stability and oxide scaling limitations for several typical logic and nonlogic CMOS circuits using both device- and circuit-level simulation models.
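To give a feel for the exponential sensitivity noted above, the following sketch evaluates a simplified empirical model of direct tunneling current density, J = J0·exp(−t_ox/λ). The prefactor `J0` and decay length `LAMBDA_EFF` are hypothetical illustrative constants, not values from this article; real numbers come from measurement or quantum-mechanical simulation.

```python
import math

# Hypothetical fitting constants, chosen only to illustrate the trend.
J0 = 1.0e9          # A/cm^2, assumed prefactor
LAMBDA_EFF = 0.13   # nm, assumed effective decay length

def direct_tunneling_current(tox_nm: float) -> float:
    """Approximate direct tunneling current density (A/cm^2)
    for an oxide of thickness tox_nm, using J = J0 * exp(-tox/lambda)."""
    return J0 * math.exp(-tox_nm / LAMBDA_EFF)

# Thinning the oxide from 1.5 nm to 1.2 nm multiplies the leakage
# by exp(0.3 / 0.13), i.e. roughly an order of magnitude.
ratio = direct_tunneling_current(1.2) / direct_tunneling_current(1.5)
print(f"leakage increase from 1.5 nm to 1.2 nm: {ratio:.1f}x")
```

With these assumed constants, a 0.3-nm reduction in thickness raises the leakage by about 10×, which is why the ITRS-projected 1.2–1.5-nm range makes gate leakage a first-order design concern.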