Intensity–hue–saturation-based image fusion using iterative linear regression
Mufit Cetin, Abdulkadir Tepecik
Abstract
The image fusion process produces a high-resolution multispectral image by combining the superior features of a low-spatial-resolution multispectral image and a high-resolution panchromatic image. Despite its widespread use, owing to its computational speed and strong sharpening ability, the intensity–hue–saturation (IHS) fusion method can introduce color distortions, especially when large gray-value differences exist between the images to be combined. This paper proposes a spatially adaptive IHS (SA-IHS) technique that avoids these distortions by automatically adjusting the amount of spatial information injected into the multispectral image during fusion. The SA-IHS method suppresses the effect of pixels that cause spectral distortions by assigning them weaker weights, thereby avoiding large redundancies in the fused image. The experimental database consists of IKONOS images, and the experimental results, both visual and statistical, demonstrate the improvement of the proposed algorithm over several other IHS-like methods such as IHS, generalized IHS, fast IHS, and generalized adaptive IHS.
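For context, the fast-IHS (generalized IHS) baseline that the proposed SA-IHS method builds on can be sketched in a few lines: each multispectral band is shifted by the difference between the panchromatic image and the intensity component. The sketch below is a minimal illustration of that classical baseline only, not the authors' SA-IHS algorithm; the function name and the assumption of a three-band image already upsampled to the panchromatic resolution are illustrative choices.

```python
import numpy as np

def fast_ihs_fusion(ms, pan):
    """Classical fast-IHS (GIHS) fusion baseline (not the paper's SA-IHS).

    ms  : (H, W, 3) multispectral image, float, upsampled to the pan grid
    pan : (H, W) panchromatic image, float
    """
    intensity = ms.mean(axis=2)       # I = (R + G + B) / 3
    delta = pan - intensity           # spatial detail to inject
    return ms + delta[..., None]      # inject the same detail into every band
```

Because the same detail term is added to all bands, the intensity of the fused result equals the panchromatic image exactly; the color distortions the paper addresses arise when this injected detail disagrees strongly with the multispectral radiometry, which is what SA-IHS down-weights on a per-pixel basis.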
© 2016 Society of Photo-Optical Instrumentation Engineers (SPIE). 1931-3195/2016/$25.00
Mufit Cetin and Abdulkadir Tepecik "Intensity–hue–saturation-based image fusion using iterative linear regression," Journal of Applied Remote Sensing 10(4), 045019 (22 November 2016). https://doi.org/10.1117/1.JRS.10.045019
Published: 22 November 2016
CITATIONS
Cited by 7 scholarly publications.
KEYWORDS
Image fusion, Image quality, Sensors, Image enhancement, Image processing, Earth observing sensors, Visualization
