I. Introduction
CIRCUITS that convert the voltage applied to their input terminals into an output current are generally referred to as transconductors or V–I converters. These circuits are basic building blocks in many applications, and the precision of the V–I conversion, with respect to the distortion and noise it introduces, can limit the precision of the overall signal processing. Numerous works address the improvement of V–I conversion, in different application contexts and via different techniques [1]–[6]. Some recent high-performance designs [7], [8] employ a combination of linearization techniques.

In this paper, an approach for a generalized mathematical treatment of the V–I function is first suggested; it explains and confirms the potential of combined linearization techniques to achieve better linearity. A new circuit solution based on MOS differential pairs is then described as an example implementation of that approach. Finally, the realization of this circuit is presented and the performance-limiting factors are discussed.
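To make concrete how transconductor nonlinearity translates into distortion, the following is a minimal numerical sketch. It assumes a hypothetical odd-symmetric polynomial model i_out = g1·v + g3·v³ (the usual small-signal form for a MOS differential pair); the coefficient values g1 and g3 and the tone amplitude A are illustrative, not from this paper. It compares the third-harmonic distortion HD3 extracted from an FFT of one tone period with the familiar small-distortion approximation HD3 ≈ |g3|A²/(4·g1):

```python
import numpy as np

# Hypothetical odd-symmetric transconductor model: i = g1*v + g3*v**3.
g1 = 1e-3    # linear transconductance, A/V (assumed value)
g3 = -2e-5   # third-order coefficient, A/V^3 (assumed value)

A = 0.5                        # test-tone amplitude, V (assumed)
n = 4096
t = np.arange(n) / n           # exactly one period of the tone
v = A * np.sin(2 * np.pi * t)
i = g1 * v + g3 * v**3         # nonlinear V-I conversion

# Harmonic amplitudes from the FFT of one full period.
spec = np.abs(np.fft.rfft(i)) / (n / 2)
hd3 = spec[3] / spec[1]        # 3rd harmonic relative to fundamental

# Small-distortion approximation: HD3 ~ |g3| * A^2 / (4 * g1).
hd3_approx = abs(g3) * A**2 / (4 * g1)
print(hd3, hd3_approx)
```

With these illustrative numbers both estimates agree to within a fraction of a percent, showing why reducing the odd-order coefficients (the goal of the linearization techniques discussed here) directly improves the achievable precision.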