The satellites cover different portions of the electromagnetic spectrum and record the incoming radiation at different spatial, temporal, and spectral resolutions. Most of these sensors operate in two modes: the multispectral mode and the panchromatic mode.
The panchromatic mode corresponds to observation over a broad spectral band (similar to a typical black-and-white photograph), whereas the multispectral (color) mode corresponds to observation in a number of relatively narrower bands. For example, in IRS-1D the LISS III sensor operates in the multispectral mode, recording energy in the green (0.52–0.59 µm), red (0.62–0.68 µm), near-infrared (0.77–0.86 µm), and mid-infrared (1.55–1.70 µm) bands, while the PAN sensor on the same satellite operates in the panchromatic mode. SPOT is another satellite that carries a combination of sensors operating in the multispectral and panchromatic modes. This is also expressed by saying that the multispectral mode has a better spectral resolution than the panchromatic mode.
Coming now to spatial resolution, on most satellites the panchromatic mode has a better spatial resolution than the multispectral mode; for example, in IRS-1C the PAN sensor has a spatial resolution of 5.8 m, whereas LISS III has 23.5 m. The better the spatial resolution, the more detailed the land-use information present in the imagery; hence PAN data is usually used for observing and separating various features. Both types of sensors have their particular utility depending on the need of the user: to separate two different kinds of land use, LISS III is used, whereas for detailed map preparation of an area, PAN imagery is extremely useful.
Image fusion is the combination of two or more different images (for example, a high-resolution panchromatic image and a lower-resolution multispectral image) to form a new image by applying a certain algorithm.
The commonly applied image fusion techniques are:
1. IHS Transformation
2. Brovey Transform
3. Band Substitution
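Of the techniques listed above, the Brovey transform is the simplest to express directly: each multispectral band is scaled by the ratio of the panchromatic value to the sum of the multispectral bands at that pixel, so the spatial detail of PAN is injected while relative band proportions (color) are preserved. The sketch below is a minimal illustration, assuming the multispectral bands have already been resampled and co-registered to the PAN grid as floating-point NumPy arrays; the function name is chosen here for illustration.

```python
import numpy as np

def brovey_fusion(ms, pan):
    """Brovey transform pan-sharpening (minimal sketch).

    ms  : array of shape (bands, H, W), multispectral bands already
          resampled to the PAN pixel grid
    pan : array of shape (H, W), panchromatic band

    Each band is multiplied by pan / sum(ms bands), so the fused bands
    keep the spectral ratios of ms but take their intensity from pan.
    """
    total = ms.sum(axis=0)
    total = np.where(total == 0, 1e-6, total)  # avoid division by zero
    return ms * (pan / total)
```

A useful property for checking the result: summing the fused bands at any pixel (where the original sum was nonzero) recovers the panchromatic value, since the per-band ratios sum to one.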