I. Introduction
The problem of fusing a high spatial and low spectral resolution image with an auxiliary image of higher spectral but lower spatial resolution, also known as multi-resolution image fusion, has been explored for many years [2]. When considering remotely sensed images, an archetypal fusion task is pansharpening, which generally consists of fusing a high spatial resolution panchromatic (PAN) image with a low spatial resolution multispectral (MS) image. Pansharpening has been addressed in the literature for several decades and remains an active topic [2]–[4]. More recently, hyperspectral (HS) imaging, which acquires the same scene in several hundreds of contiguous spectral bands, has opened a new range of relevant applications, such as target detection, classification, and spectral unmixing [5]. The visualization of HS images has also been investigated [6]. Naturally, to take advantage of the benefits offered by HS images, the problem of fusing HS and PAN images has received some attention in the literature [7]–[9]. Capitalizing on decades of experience in MS pansharpening, most HS pansharpening approaches merely adapt existing algorithms designed for PAN and MS fusion [10], [11]. Other methods are specifically designed for the HS pansharpening problem (see, e.g., [8], [12], [13]).

Conversely, the fusion of MS and HS images has been considered in fewer research works and remains a challenging problem because of the high dimensionality of the data to be processed. Indeed, the fusion of MS and HS images differs from traditional MS or HS pansharpening in that much more spatial and spectral information is contained in multi-band images. This additional information can be exploited to obtain an image of both high spatial and high spectral resolution. In practice, the spectral bands of panchromatic images always cover the visible and infrared spectra.
However, in several practical applications, the spectrum of the MS data includes additional spectral bands outside the PAN spectral range. For instance, the MS data of WorldView-3 (see http://www.satimagingcorp.com/satellite-sensors/WorldView3-DS-WV3-Web.pdf) have spectral bands in the intervals [400–1750] nm and [2145–2365] nm, whereas the PAN data cover the range [450–800] nm. Another interesting example is the HS+MS suite, called the hyperspectral imager suite (HISUI), developed by the Japanese Ministry of Economy, Trade, and Industry (METI) [14]. HISUI is the Japanese next-generation Earth-observing sensor composed of HS and MS imagers; it will be launched by the H-IIA rocket in 2015 or later as one of the mission instruments onboard JAXA's ALOS-3 satellite. Some research activities have already been conducted on this practical multi-band fusion problem [15]. However, many pansharpening methods, such as component substitution [2], relative spectral contribution [16], and high-frequency injection [17], are inapplicable or inefficient for this fusion problem. To address the challenge raised by the high dimensionality of the data to be fused, innovative methods need to be developed, which is the main objective of this paper.