I. Introduction
Nitride-based localized trapping storage Flash memory [1], [2] has been demonstrated to be a promising candidate for NOR-type applications because of its simpler process, smaller bit size, and absence of the floating-gate coupling effect. In spite of these advantages, gate length scaling remains a challenge for NOR operation because channel hot electron (CHE) programming requires a high drain voltage. Moreover, several unique reliability issues, including the second-bit effect (2nd-bit effect) [3], [4] and secondary-hot-electron-induced program disturbance, have been characterized in the virtual-ground array [3], [5].

As demonstrated in [6] and [7], source/drain (S/D) engineering, such as adjusting the pocket implantation [6] and reducing the junction dosage [7], is effective in suppressing cell punch-through. To reduce the 2nd-bit effect, the depletion region under a fixed drain bias should be as large as possible [1], [8]. To alleviate the X-disturbance, deeper S/D junctions are favored [5], [9]. Moreover, random telegraph noise (RTN) has been reported to correlate with the local variation of the channel potential [10]. Optimizing the cell implantations therefore requires a trade-off among these considerations.

In this paper, gate-induced drain leakage (GIDL), which is sensitive to variations of the gate-edge field [11], [12], is employed to monitor both the S/D dosage and the magnitude of the programmed charges above the S/D junctions. Thus, the corresponding changes in cell performance, such as program speed, 2nd-bit effect, and program disturbance, can be characterized. Finally, the dosage effect on RTN is also examined.
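As background for why GIDL is a sensitive monitor of the gate-edge condition, the GIDL current is commonly modeled by the textbook band-to-band tunneling expression (a generic form, not the specific model of [11], [12]):

\[
I_{\mathrm{GIDL}} \;\propto\; A\,E_{s}\,\exp\!\left(-\frac{B}{E_{s}}\right),
\]

where $E_{s}$ is the vertical surface field in the gate-to-drain overlap region and $A$, $B$ are material-dependent tunneling parameters ($B \approx 21.3\,\mathrm{MV/cm}$ is often quoted for silicon). Because $E_{s}$ appears in the exponent, even small shifts in $E_{s}$ caused by a change in S/D dosage or by trapped charge above the junction produce a measurable change in $I_{\mathrm{GIDL}}$, which is what makes GIDL useful as a monitor here.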