Paper
18 October 2005
Eight different fusion techniques for use with very high-resolution data
Abstract
All the commercial satellites (SPOT, LANDSAT, IRS, IKONOS, Quickbird and Orbview) collect a high-spatial-resolution panchromatic image and multiple (usually four) multispectral images with significantly lower spatial resolution. The PAN images are characterised by a very high spatial information content, well suited for intermediate-scale mapping applications and urban analysis. The multispectral images provide the essential spectral information for smaller-scale thematic mapping applications such as land-use surveys. Why don't most satellites collect high-resolution MS images directly, to meet this requirement for high spatial and high spectral resolution? There is a limit to the data volume that a satellite sensor can store on board and then transmit to a ground receiving station, and the panchromatic image is usually many times larger than the multispectral images. The panchromatic image of Landsat ETM+ is four times larger than an ETM+ multispectral image, and the panchromatic image of IKONOS, Quickbird, SPOT 5 and Orbview is sixteen times larger than the respective multispectral images. As a result, if a sensor collected high-resolution multispectral data it could acquire fewer images during each pass. Given these limitations, the most effective way to provide high-spatial-resolution and high-spectral-resolution remote sensing images is to develop effective image fusion techniques. Image fusion is a technique used to integrate the geometric detail of a high-resolution panchromatic (PAN) image and the color information of a low-resolution multispectral (MS) image to produce a high-resolution MS image. During the last twenty years many methods, such as Principal Component Analysis (PCA), the Multiplicative Transform, the Brovey Transform and the IHS Transform, have been developed, producing good-quality fused images.
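The per-band size ratios quoted above follow directly from the ratio of ground sample distances. As a minimal sketch (the 1 m / 4 m figures are illustrative values for an IKONOS-class sensor, not taken from the paper):

```python
# Pixel-count ratio between a Pan image and one MS band over the same
# footprint, given their ground sample distances (GSD). Halving the GSD
# quadruples the pixel count, so a 4:1 GSD ratio gives a 16:1 size ratio.
pan_gsd = 1.0  # metres per Pan pixel (illustrative, IKONOS-class)
ms_gsd = 4.0   # metres per MS pixel (illustrative)
ratio = (ms_gsd / pan_gsd) ** 2
print(ratio)  # 16.0 Pan pixels per MS pixel, i.e. ~16x the data per band
```

The same arithmetic with ETM+'s 15 m Pan and 30 m MS pixels yields the 4:1 ratio cited in the abstract.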
Despite the quite good optical results, many research papers have reported the limitations of the above fusion techniques. The most significant problem is color distortion. Another common problem is that the fusion quality often depends on the operator's fusion experience and on the data set being fused; no automatic solution has been found that consistently produces high-quality fusion for different data sets. More recently, new techniques have been proposed, such as the Wavelet Transform, the Pansharp Transform and the Modified IHS Transform. These techniques seem to reduce the color distortion problem and to keep the statistical parameters unchanged. In this study we compare the efficiency of eight fusion techniques, specifically the Multiplicative, Brovey, IHS, Modified IHS, PCA, Pansharp, Wavelet and LMM (Local Mean Matching) techniques, for the fusion of IKONOS data. For each merged image we have examined the optical qualitative result and the statistical parameters of the histograms of the various frequency bands, especially the standard deviation. All the fusion techniques improve the resolution and the optical result. The Pansharp, Wavelet and Modified IHS merging techniques do not change the statistical parameters of the original images at all. These merging techniques are recommended if the researcher wants to proceed to further processing, for example computing vegetation indices or performing classification using the spectral signatures.
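To make the kind of technique compared here concrete, one of the classical methods named in the abstract, the Brovey transform, can be sketched as below. This is an illustrative NumPy implementation, not the author's code; the array names, shapes, and the assumption that the MS bands have already been resampled to the Pan grid are all ours:

```python
import numpy as np

def brovey_fusion(ms, pan):
    """Brovey transform fusion (illustrative sketch).

    ms  : (bands, H, W) multispectral array, assumed already resampled
          to the Pan pixel grid.
    pan : (H, W) panchromatic array.
    Each fused band i is MS_i * Pan / (sum over all MS bands), i.e. the
    Pan intensity is redistributed according to each band's share of the
    total multispectral signal.
    """
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    total = ms.sum(axis=0)
    total[total == 0] = 1e-12  # guard against division by zero
    return ms * pan / total

# Toy example: a uniform 3-band MS patch and a brighter Pan patch.
ms = np.ones((3, 4, 4))
pan = np.full((4, 4), 6.0)
fused = brovey_fusion(ms, pan)
print(fused[0, 0, 0])  # 1 * 6 / 3 = 2.0
```

Because each fused band is rescaled by the Pan/total ratio, the Brovey transform changes the bands' histograms, which is exactly the statistical alteration that distinguishes it from the Pansharp, Wavelet and Modified IHS methods favoured in the study.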
© (2005) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Konstantinos G. Nikolakopoulos "Eight different fusion techniques for use with very high-resolution data", Proc. SPIE 5982, Image and Signal Processing for Remote Sensing XI, 598205 (18 October 2005); https://doi.org/10.1117/12.626796
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS: Image fusion, Data fusion, RGB color model, Multispectral imaging, Wavelets, Principal component analysis, Earth observing sensors
