I. Introduction
Visual motion is one of the key sources of sensory information for small robotic platforms. Power consumption, size, weight, and the ability of on-board sensors to operate under natural, uncontrolled illumination are important specifications for such platforms. Optical sensors that derive motion vectors for the control of small robotic platforms and micro air vehicles (MAVs) have been demonstrated in [1]–[4]. The MAV in [1], for example, used two one-dimensional linear CMOS cameras: one at the front of the flyer pointing forward and the other beneath the flyer pointing downward. Motion information extracted from the visual scene was used to estimate altitude with the downward-looking sensor and to avoid collisions with the forward-looking camera. Motion is extracted using a 1D image-interpolation algorithm [5].
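The 1D image-interpolation approach cited above estimates the sub-pixel displacement between two scan lines in closed form. The following sketch is an illustrative reconstruction of that idea, not the code used in [1] or [5]; it assumes two registered intensity lines `ref` and `cur` from a linear camera and a hypothetical reference shift `d` of one pixel.

```python
import numpy as np

def estimate_shift_1d(ref: np.ndarray, cur: np.ndarray, d: int = 1) -> float:
    """Least-squares estimate of the displacement (in pixels) of `cur`
    relative to `ref`, both 1D intensity arrays of equal length.

    The current line is modeled as an interpolation between copies of the
    reference line shifted by +/- d pixels; the shift that minimizes the
    squared error has a closed-form solution.
    """
    # Reference samples f0(x - d) and f0(x + d) over the interior pixels x.
    f_m = ref[:-2 * d]          # f0(x - d)
    f_p = ref[2 * d:]           # f0(x + d)
    f0 = ref[d:-d]              # f0(x), interior window
    f = cur[d:-d]               # current line, same interior window

    num = np.sum((f - f0) * (f_m - f_p))
    den = np.sum((f_m - f_p) ** 2)
    # Positive result: scene pattern moved toward increasing pixel index.
    return 2.0 * d * num / den if den > 0 else 0.0
```

Because the estimate is a single ratio of two sums over the pixel array, it is inexpensive enough to run at frame rate on the limited on-board computing of an MAV, which is one reason interpolation-based schemes are attractive for such platforms.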