I. Introduction
The existence of a tradeoff between spatial and spectral resolution in spaceborne remote sensing imagery is well known. This tradeoff stems from observational constraints imposed by, among other factors, the acquisition system, the detector specifications, and the satellite motion. As a result, spaceborne imagery is usually offered to the community as two separate products: a high-resolution panchromatic (HRP) image and a low-resolution multispectral (LRM) image. At the same time, a growing number of applications, such as feature detection, change monitoring, and land cover classification, demand both high spatial and high spectral resolution to achieve their objectives. In response to these needs, image fusion has emerged as a powerful solution, providing a single image that combines the multispectral content of the original LRM image with enhanced spatial resolution.