
CMOS 3D image sensor based on pulse modulated time-of-flight principle and intrinsic lateral drift-field photodiode pixels


Abstract:

Design and measurement results of a CMOS 128 × 96 pixel sensor are presented, which can be used for three-dimensional (3D) scene reconstruction applications based on the indirect time-of-flight (ToF) principle enabled by pulse modulated active laser illumination. The 40μm pitch pixels are based on the novel intrinsic lateral drift-field photodiode (LDPD), which allows for a complete charge transfer from the photoactive area into the readout node within 30ns and for the accumulation of signal charge over several readout cycles for an extended signal-to-noise ratio (SNR). Distance measurements have been performed using a specially developed camera system.
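The accumulation of signal charge over several readout cycles extends the SNR. As a rough illustration only (assuming the accumulated signal is shot-noise limited, which the paper does not state), the SNR then grows with the square root of the total collected charge:

import math

def shot_noise_limited_snr(electrons_per_cycle: float, n_cycles: int) -> float:
    # SNR of the accumulated signal under a pure Poisson (shot-noise) assumption:
    # signal = N_e, noise = sqrt(N_e), hence SNR = sqrt(N_e).
    total_electrons = electrons_per_cycle * n_cycles
    return math.sqrt(total_electrons)

print(shot_noise_limited_snr(100, 1))   # ~10 for a single readout cycle
print(shot_noise_limited_snr(100, 16))  # ~40, i.e. a 4x gain after 16 accumulations

Under this simplifying assumption, accumulating over 16 cycles would improve the shot-noise-limited SNR by a factor of four; real pixels also add readout and dark-current noise, so the actual gain depends on the sensor.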
Date of Conference: 12-16 September 2011
Date Added to IEEE Xplore: 13 October 2011
Conference Location: Helsinki, Finland

I. Introduction

Scannerless 3D scene reconstruction based on the indirect time-of-flight (ToF) principle relies on measuring the time elapsed between the moment at which a light signal, actively modulated and widened by a special diffusing lens to cover the entire scene, is sent by a light source, and the moment at which, after being reflected by the different objects in the scene, it impinges on the photosensor, which is usually located next to the emitting light source. The photosensor is operated synchronously with the emission of the light pulse, which enables the evaluation of the exact delay of the returned pulsed signal impinging on each individual pixel and thus the determination of the distance of the different objects in the illuminated scene. Due to its non-ambiguity, this measurement principle allows for operation ranges from a few centimeters to tens of meters [1]. The indirect ToF measurement utilizes the integration of the photogenerated charge during an ultra-short shutter time [2]. For centimetre accuracies, nevertheless, nanosecond time discrimination capability, high detection speed, low noise, and a high signal-to-noise ratio (SNR) are required [3]. For example, according to Eq. (1) [2], where $d_{max}$ is the maximum distance between the sensor and a defined object in the scene, $c$ is the velocity of light and $T_{pulse}$ is the length of the pulse emitted by the light source, a maximum distance of 4.5m requires a pulse width of 30ns. The problem with such short time scales is the short integration time of the photogenerated charge, which demands short transit times, low noise and fast readout in each pixel [4], [5], [6], [7].

$$d_{max} = {c \over 2} \cdot T_{pulse} \qquad \hbox{(1)}$$

Figure: Technology cross-section of the LDPD used in the presented 3D image sensor (not to scale).
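The range relation in Eq. (1) can be checked numerically. The following sketch is illustrative only; the function names and the example round-trip delay are assumptions, not taken from the paper:

C = 299_792_458.0  # speed of light in m/s

def max_unambiguous_range(t_pulse_s: float) -> float:
    # Eq. (1): d_max = (c / 2) * T_pulse
    return 0.5 * C * t_pulse_s

def range_from_delay(t_delay_s: float) -> float:
    # Distance implied by a measured round-trip delay of the returned pulse
    return 0.5 * C * t_delay_s

print(max_unambiguous_range(30e-9))  # ~4.50 m for the 30ns pulse used here
print(range_from_delay(20e-9))       # ~3.00 m for a hypothetical 20ns return delay

For the 30ns pulse width quoted above, Eq. (1) indeed gives a maximum distance of about 4.5m.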
