I. Introduction
Mobile receivers supporting millimeter-wave (mmWave) standards, e.g., 802.11ad WiGig [1], [2], require multi-GS/s, medium-resolution ADCs. Achieving high power efficiency in the ADC can be challenging since, in practice, it is located on the same die as the baseband signal processor, which is almost always fabricated in the low-power (LP) flavor of CMOS. Time-interleaved successive approximation register (TI-SAR) ADCs [3], [4] are well suited to this task due to their digital-friendly nature and the continued improvement of capacitor matching (for a fixed capacitance) and density with CMOS scaling. However, time-interleaving introduces performance impairments in TI-SAR ADCs, and mitigating them necessitates long and elaborate digitally-assisted calibration schemes [3], [5]. Although very high performance [5] can be achieved with extensive background/foreground calibration [6], especially in the general-purpose (GP) flavor of CMOS, the implementation of such ADCs in a practical receiver has not been reported, to the authors' best knowledge. A typical receiver is shown in Fig. 1, in which almost all blocks require tuning and/or calibration to maximize the link quality in real time. Several factors weigh against extensive calibration in each unit block, including the ADC: the need for frequent recalibration/retuning caused by drifts in channel and environmental conditions; on-chip implementation challenges, including special input signals, complex algorithms, excess calibration power, and other overheads (e.g., pilot length) [7]; and the requirements of fast transceiver start-up and duty-cycling for power reduction.
Fig. 1. Typical 802.11ad receiver block diagram.
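As a point of standard background on the interleaving impairments mentioned above (a sketch of textbook spur locations, not a result of this work): for an $M$-way TI-ADC running at an aggregate sample rate $f_s$ and driven by an input tone at $f_{\mathrm{in}}$, inter-channel offset mismatch produces spurious tones at

$$f_{\mathrm{off},k} = k\,\frac{f_s}{M}, \qquad k = 1,\dots,M-1,$$

independent of the input, while gain and timing-skew mismatches modulate the input tone, producing spurs at

$$f_{\mathrm{g/s},k} = \pm f_{\mathrm{in}} + k\,\frac{f_s}{M}, \qquad k = 1,\dots,M-1.$$

These spurs degrade SNDR/SFDR and are the principal targets of the digitally-assisted calibration schemes in [3], [5].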