Fusion of Reconstructed Multispectral Images
Valery Starovoitov
Institute of Computer Science
University of Bialystok
Bialystok, Poland,
United Institute of Informatics Problems
Minsk, Belarus
valerys@newman.bas-net.by
Aliaksei Makarau
United Institute of Informatics Problems
Minsk, Belarus
makarau@newman.bas-net.by
Igor Zakharov and Dmitry Dovnar
Institute of Technology of Metals
Mogilev, Belarus
zakharov@ieee.org, dovnar@inbox.ru
Abstract—A new technique for fast fusion of multiresolution satellite images with minimal color distortion is presented in this paper. The technique reconstructs multispectral images with a resolution higher than that of the panchromatic image. It combines image super-resolution restoration with image fusion based on global regression. The super-resolution restoration processes several multispectral images simultaneously to reconstruct a panchromatic image of higher resolution. The restoration method is quasi-optimal in the sense of the minimum squared error of image restoration.
Keywords—image fusion, image restoration, super-resolution
I. INTRODUCTION
Image fusion, or pan-sharpening, is performed to increase the resolution of multispectral images using the information of the panchromatic image. The resolution of the sharpened images is limited by the resolution of the panchromatic image. For example, Landsat 7 ETM+ provides panchromatic images with 15 m resolution and multispectral images with 30 m resolution; therefore, the resolution of the sharpened multispectral images is two times higher than the original one. The Brovey transform [1] and IHS fusion [2] are point-type methods, fast and applicable to large images. PCA and wavelet-based fusion methods [3] are global and local operator-type methods and are computationally expensive. In satellites launched after 1999, the sensitivity range of the panchromatic sensor is usually extended to cover the near-infrared range. This is done to increase the resolution of the registered panchromatic image, but it causes significant color distortion in previously developed fusion methods [4].
We develop a fundamentally new technique for multispectral image fusion that achieves a resolution higher than that of the panchromatic image while minimizing distortion of the color composite. Two tasks have to be solved to achieve the higher resolution: panchromatic image reconstruction (super-resolution) and multispectral image fusion.

The first task can be solved by reconstructing a panchromatic image from color images with a demosaicing algorithm [5] or by super-resolution reconstruction [6]. Our solution is based on a method that is quasi-optimal in the sense of the minimum squared error (MSE) of image restoration [7]. This method allows a theoretical quality evaluation and super-resolution restoration from several images. The solution of the second task may be based on an algorithm of multiresolution image fusion. We select a fusion algorithm based on linear regression. The algorithm is of point type (i.e. fast) and provides minimal distortion of the color composite.

The restored panchromatic and original panchromatic images were compared using the structural similarity index (SSI) from [8]. The quality of the enlarged images was evaluated by the Euclidean norm (L2) of the histogram difference between the original and the fused images, correlation, RMSE, and the square error of the difference image (SEDI).
II. IMAGE RESTORATION METHOD

The optimal solution of Fredholm's integral equation of the first kind may be used to increase the image resolution [7]. Note that the images are registered by a focal plane array (FPA).
A. Model of Image Formation
Information about the optical properties of the original ideal image $Z(\xi,\eta,\lambda)$ is transmitted through an optical system, which is characterized by a point-spread function (PSF) $K(x,y,\xi,\eta,\lambda)$. The optical system projects an image $f(x,y,\lambda)$ onto the FPA. This process can be described by the linear equation

$$f(x,y,\lambda) = \int_{-S_1}^{S_1}\!\int_{-S_2}^{S_2} Z(\xi,\eta,\lambda)\,K(x,y,\xi,\eta,\lambda)\,d\xi\,d\eta, \qquad (1)$$

where $S_1, S_2$ are the integration limits, $(\xi,\eta)$ are the coordinates of a point in the plane of the image $Z(\xi,\eta,\lambda)$, $\lambda$ is the wavelength of light, and $(x,y)$ are the coordinates of a point in the plane of the registered image $f(x,y,\lambda)$. The images in different spectral bands and light wavelengths in (1) are labeled $f(x,y,\lambda_p,p)$, where $p = 0,1,\ldots,P$ and $\lambda = \lambda_0,\lambda_1,\ldots,\lambda_P$.
The spectral photosensitivity function $Sen(x,y,\lambda)$ of an FPA element can be non-uniform [9]. The fill factor of the FPA, which may be known from the manufacturer, can be used to describe this function.
Registration of the ideal image $Z(\xi,\eta,\lambda)$ can be described by Fredholm's integral equation. For the image $F(i,j,p)$ registered by the FPA the equation is

$$F(i,j,p) = \int_{-S_1}^{S_1}\!\int_{-S_2}^{S_2} Z(\xi,\eta,\lambda_p)\,K_1(i,j,\xi,\eta,\lambda,p)\,d\xi\,d\eta + \gamma(i,j,p), \qquad (2)$$

where

$$K_1(i,j,\xi,\eta,\lambda,p) = \iint_{A_{i,j}} Sen(x,y,\lambda_p)\,K(x,y,\xi,\eta,\lambda,p)\,dx\,dy, \qquad (3)$$

$F(i,j,p)$ is the signal registered by the FPA element $(i,j)$ with area $A_{i,j}$, $\lambda_p$ is the light wavelength for image $p$, and $\gamma(i,j,p)$ is the error of the signal (additive noise). We can write

$$F(i,j,p) = \iint_{A_{i,j}} Sen(x,y,\lambda_p)\,f(x,y,\lambda_p,p)\,dx\,dy + \gamma(i,j,p). \qquad (4)$$

Hence, the function $K$ (the PSF) in (3) can be written as

$$K(x,y,\xi,\eta,\lambda,p) = \begin{cases} K_0(x,y,\xi,\eta,\lambda), & p = 0,\\ K_1(x,y,\xi,\eta,\lambda), & p = 1,\\ \;\;\vdots & \\ K_P(x,y,\xi,\eta,\lambda), & p = P. \end{cases} \qquad (5)$$
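The following Python sketch gives a discrete analogue of the registration model (2)-(5) for a single band: the ideal image is blurred by a band-specific PSF (a Gaussian is assumed purely for illustration), weighted by the sensor sensitivity Sen, integrated over each FPA cell (block summation), and corrupted by additive noise gamma. The shapes, PSF width and noise level are illustrative assumptions, not values from the paper.

import numpy as np
from scipy.ndimage import gaussian_filter

def register_on_fpa(Z, sen, psf_sigma, cell=4, noise_std=1.0, rng=None):
    # Discrete sketch of (2)-(4): blur the ideal image Z with the band PSF,
    # weight by the sensitivity sen, integrate over each cell x cell FPA
    # element and add noise gamma. Z and sen share the same grid.
    rng = np.random.default_rng() if rng is None else rng
    blurred = gaussian_filter(Z, psf_sigma)          # Z convolved with an assumed Gaussian PSF
    weighted = sen * blurred                         # Sen(x, y, lambda_p) * f(x, y, lambda_p)
    h, w = weighted.shape
    cells = weighted[:h - h % cell, :w - w % cell]
    cells = cells.reshape(h // cell, cell, w // cell, cell)
    F = cells.sum(axis=(1, 3))                       # integral over the pixel area A_ij
    return F + rng.normal(0.0, noise_std, F.shape)   # additive noise gamma(i, j, p)

# Illustrative usage for one band p with its own PSF width, as in (5).
Z = np.random.rand(256, 256)          # stand-in for the ideal image Z(xi, eta, lambda_p)
sen = np.ones_like(Z)                 # uniform sensitivity; can be non-uniform [9]
F = register_on_fpa(Z, sen, psf_sigma=1.5)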
B. Solution of the Integral Equation
Solution of (1) is an ill-posed problem. According to regularization theory, small errors (noise) $\gamma(i,j,p)$ can lead to a large dispersion in the solution of the equation. It is, however, possible to build an approximate solution with regularized properties and to evaluate the average squared error of restoration over the set of solutions. Let us consider such an approximate stabilized solution. The solution can be represented as a decomposition over an arbitrary orthonormal system of basis functions $\psi_k(\xi,\eta,\lambda)$:

$$Z(\xi,\eta,\lambda) = \sum_{k=1}^{\infty} c_k\,\psi_k(\xi,\eta,\lambda), \qquad |\xi| < S_1,\ |\eta| < S_2, \qquad (6)$$

where $c_k$ are the decomposition coefficients. The initial image $Z$ and the noise are uncorrelated.
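To make the decomposition (6) concrete, the sketch below expands a sampled image over an orthonormal basis and truncates the series to a finite number of coefficients; the 2-D DCT is used here only as an example of an orthonormal system of basis functions, which the paper leaves arbitrary.

import numpy as np
from scipy.fft import dctn, idctn

def truncated_decomposition(Z, keep=32):
    # Expand Z over an orthonormal basis (here the 2-D DCT) as in (6)
    # and keep only the first keep x keep coefficients c_k.
    c = dctn(Z, norm='ortho')            # decomposition coefficients c_k
    c_trunc = np.zeros_like(c)
    c_trunc[:keep, :keep] = c[:keep, :keep]
    return idctn(c_trunc, norm='ortho')  # reconstruction from the truncated series

Z = np.random.rand(64, 64)
Z_approx = truncated_decomposition(Z, keep=16)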
In comparison with the Wiener method, the main advantage of the presented image restoration method is the possibility of calculating the MSE for a small number of pixels in the initial image. This is very important for the reconstruction of images registered by an FPA [10]. The values of a signal $F(i,j,p)$ are used for the restoration $Z^*(\xi,\eta,\lambda)$ of the ideal continuous image $Z$. An algorithm described in [7] was adopted for this restoration. The algorithm is fast, but in comparison with blind image restoration methods (e.g. [11]) it has a disadvantage: the PSF has to be known. The image restoration filter $Q(i,j,p,\xi,\eta,\vec{\beta})$ can be calculated by the equations in [10]. The filter is calculated using the stabilization parameters $\vec{\beta} = (\beta_1,\beta_2,\ldots)$ by the equation

$$Q(i,j,p,\xi,\eta,\vec{\beta}) = \sum_{m=1}^{G} \frac{\displaystyle\sum_{k=1}^{m} d_{km}(\vec{\beta})\,\varphi_k(i,j,p)\,\sum_{k=1}^{m} d_{km}(\vec{\beta})\,\psi_k(\xi,\eta)}{\displaystyle\beta_m + \sum_{k=1}^{m} d_{km}(\vec{\beta})\,(\varphi_k,\varphi_m)}, \qquad (7)$$
where $(\varphi_k,\varphi_m) = \sum_{i=1}^{I}\sum_{j=1}^{J} \varphi_k(i,j)\,\varphi_m(i,j)$ are scalar products of the images of the basis functions, $G$ is the number of functions in the object decomposition, $I \times J$ is the number of pixels, and $d_{ln}(\vec{\beta})$ are recurrently calculated coefficients:

$$d_{ln}(\vec{\beta}) = -\sum_{m=l}^{n-1} d_{lm}(\vec{\beta})\,\frac{\displaystyle\sum_{k=1}^{m} d_{km}(\vec{\beta})\,(\varphi_k,\varphi_n)}{\displaystyle\beta_m + \sum_{k=1}^{m} d_{km}(\vec{\beta})\,(\varphi_k,\varphi_m)}, \quad l = 1,2,\ldots,n-1;\ n = 2,3,\ldots;\quad d_{kk}(\vec{\beta}) = 1,\ k = 1,2,\ldots \qquad (8)$$
The summation is made over all sampling points of the image $F(i,j,p)$ for the restoration of the original image:

$$Z^*(\xi,\eta,\lambda) = \sum_{i,j,p} F(i,j,p)\,Q(i,j,p,\xi,\eta,\vec{\beta}). \qquad (9)$$
Knowledge of the PSF (5) is required for restoration based
on different spectral images.
Image restoration algorithm

Step 1. Enter the images $F(i,j,p)$, $p = 0,1,\ldots,P$. Enlarge the images by interpolation.

Step 2. Set the parameters of the FPA: the fill factor or $Sen(x,y,\lambda)$ and the SNR; enter the values of the PSF $K(x,y,\xi,\eta,\lambda_p,p)$ for every image.

Step 3. Calculate the filter $Q(i,j,p,\xi,\eta,\vec{\beta})$ and reconstruct an image $Z^*$ by (9); a minimal sketch of this final summation is given below.
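The restoration (9) is a weighted summation of all registered samples F(i,j,p) with the precomputed filter Q. The following Python sketch shows only this last step; it assumes Q has already been obtained from the PSF (5), the basis functions and the stabilization parameters via (7)-(8), and the array shapes are purely illustrative.

import numpy as np

def restore(F, Q):
    # Equation (9): Z*(xi, eta) = sum over i, j, p of F(i, j, p) * Q(i, j, p, xi, eta).
    # F has shape (I, J, P); Q has shape (I, J, P, N_xi, N_eta).
    return np.einsum('ijp,ijpxy->xy', F, Q)

# Illustrative shapes only; in practice Q is computed as described above.
F = np.random.rand(16, 16, 3)
Q = np.random.rand(16, 16, 3, 64, 64)
Z_star = restore(F, Q)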
III. MULTISPECTRAL IMAGE FUSION USING GLOBAL REGRESSION
Hill et al. proposed an algorithm for multiresolution image fusion [12]. The algorithm calculates a local regression in a sliding window of size $n \times n$ ($n = 5$ in [12]) between the degraded ($Pan^{deg}$) and scaled ($Pan^{low}$) panchromatic image and the spectral image. The local regression between the image $Pan^{low}$ and the spectral image $Mult_j$ is calculated by

$$Mult_j = A_j + B_j \cdot Pan^{low} + E_j, \qquad (10)$$

where $A_j$ and $B_j$ are the matrices of local regression parameters for the $j$-th spectral image, $E_j$ is the matrix of residuals, and $Pan^{low}$ is the degraded and scaled-down panchromatic image. The spectral image $Mult_j$ and the matrix $B_j$ are resized to the size of the panchromatic image by interpolation (the results are denoted $Mult_j^h$ and $B_j^h$). The spectral image with high resolution is calculated by

$$Mult_j^{high} = Mult_j^h + B_j^h\,(Pan - Pan^{deg}), \qquad (11)$$

where $Mult_j^{high}$ is the high-resolution spectral image. This is a local-type algorithm and is computationally expensive. Keeping the matrix $B_j^h$, whose size is equal to the size of the panchromatic image, is also costly in memory.
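A compact Python sketch of this local scheme follows. It estimates the local slope B_j from window means and second moments (an equivalent way of solving the per-window least squares of (10)), upsamples Mult_j and B_j to the panchromatic grid and applies (11). The use of scipy filters, bilinear zoom and an assumed integer resolution ratio are simplifications, not details from [12].

import numpy as np
from scipy.ndimage import uniform_filter, zoom

def local_regression_fusion(mult, pan, pan_deg, pan_low, n=5):
    # Sliding-window least squares for (10): B = cov(Pan_low, Mult) / var(Pan_low).
    m_p = uniform_filter(pan_low, n)
    m_m = uniform_filter(mult, n)
    cov = uniform_filter(pan_low * mult, n) - m_p * m_m
    var = uniform_filter(pan_low * pan_low, n) - m_p * m_p
    B = cov / (var + 1e-12)                  # local slope B_j
    scale = pan.shape[0] / mult.shape[0]     # assumes square images, integer ratio
    mult_h = zoom(mult, scale, order=1)      # Mult_j resized to the Pan grid
    B_h = zoom(B, scale, order=1)            # B_j resized to the Pan grid
    return mult_h + B_h * (pan - pan_deg)    # equation (11)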
The distribution of the local regression parameters $B_j$ of Landsat 7 ETM+ images was analyzed. The median values of $B_j$ for all spectral images and the coefficients of the global linear regression are very close. Since the median values are close to the coefficients of the global regression, the global regression may be applied for multiresolution image fusion.
Algorithm for multiresolution image fusion based on global regression

Input data: $Mult_j$ is the spectral image, $Nir$ is the near-infrared image, $Pan$ is the panchromatic image.

Step 1. For the $Mult_j$ and $Pan$ images assign zero values to the pixels representing cloud, water and shadow areas. The mask can be calculated by the following formulas:

$$Mask(1..M, 1..N) = 1; \quad Mask(B > T_B) = 0; \quad Mask(Nir < T_{Nir}) = 0, \qquad (12)$$

where $M \times N$ is the size of the image, $B$ is the spectral image of the blue color range, $Mask$ is the mask whose zero pixels represent water, shadow and cloud areas, and $T_B$ and $T_{Nir}$ are the thresholds for the $B$ and $Nir$ images. The thresholds $T_B$ and $T_{Nir}$ were determined experimentally (200 and 40 for Landsat 7 ETM+).

Step 2. Degrade the $Pan$ image by a low-pass filter ($3 \times 3$ average filtering, for example); denote the result $Pan^{deg}$. Scale $Pan^{deg}$ down to the size of the spectral image by interpolation (denote the result $Pan^{low}$).

Step 3. Calculate the global regression coefficients from

$$Mult_j = a_j + b_j \cdot Pan^{low} + E_j, \qquad (13)$$

where $a_j$ and $b_j$ are the global regression parameters and $E_j$ is the residuals matrix.

Step 4. Perform spatial scaling of the $Mult_j$ image to the size of the panchromatic image (denote the result $Mult_j'$).

Step 5. Increase the resolution of the spectral image by

$$Mult_j'' = Mult_j' + b_j\,(Pan - Pan^{deg}), \qquad (14)$$

where $Mult_j''$ is the spectral image with increased resolution. Assign 0 or 255 to the pixels of the fused image whose values are less than 0 or greater than 255, respectively. A minimal implementation sketch of steps 1-5 is given below.
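The sketch below follows steps 1-5 in Python under a few simplifying assumptions: all bands are square with an integer panchromatic-to-multispectral resolution ratio, a box filter and bilinear zoom stand in for the unspecified low-pass filter and interpolation, and masked pixels are excluded from the regression fit rather than set to zero. The thresholds 200 and 40 are the experimental Landsat 7 ETM+ values quoted above.

import numpy as np
from scipy.ndimage import uniform_filter, zoom

def global_regression_fusion(mult, nir, blue, pan, t_b=200, t_nir=40):
    # Step 1: mask cloud (bright blue), water and shadow (dark Nir) pixels, as in (12).
    mask = (blue <= t_b) & (nir >= t_nir)
    # Step 2: degrade Pan with a 3x3 average filter and scale it down to the spectral grid.
    pan_deg = uniform_filter(pan, 3)
    scale = mult.shape[0] / pan.shape[0]
    pan_low = zoom(pan_deg, scale, order=1)
    # Step 3: global regression coefficients a_j, b_j over the unmasked pixels, as in (13).
    b_j, a_j = np.polyfit(pan_low[mask], mult[mask], 1)   # a_j is not needed in (14)
    # Step 4: scale the spectral image up to the panchromatic grid.
    mult_up = zoom(mult, 1.0 / scale, order=1)
    # Step 5: inject the panchromatic detail, as in (14), and clip to [0, 255].
    fused = mult_up + b_j * (pan - pan_deg)
    return np.clip(fused, 0, 255)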
Figure 1. Example of image restoration from multispectral images: a) a fragment of the red spectral band image; b) the corresponding fragment of the panchromatic image; c) the interpolated red-band image; d) the image reconstructed from the multispectral, near-infrared and panchromatic images.
IV. RESULTS OF ALGORITHM EVALUATION

A. Image Restoration Results

Fig. 1 (a) and (b) illustrate fragments of the red spectral band image and of the original panchromatic image. Fig. 1 (c) presents the interpolated red-band image. Fig. 1 (d) presents the image restored using the multispectral, near-infrared and panchromatic images.
B. Quality Assessment of Image Restoration Results

The restored image was compared with the interpolated spectral images and the interpolated panchromatic image by calculating the correlation, RMSE, square error of the difference image (SEDI) and structural similarity (see Table I). The restored image has a higher correlation with the interpolated Nir image than with the other spectral images, so utilization of the Nir image for Pan image restoration is desirable.
TABLE I. COMPARISON OF INTERPOLATED SPECTRAL IMAGES WITH THE RESTORED IMAGE SHOWN IN FIG. 1(D)

Interpolated image | Correlation | RMSE  | SEDI  | SSI
Pan                | 0.988       | 18694 | 4.15  | 0.88
R                  | 0.143       | 42481 | 13.33 | 0.59
G                  | 0.231       | 52519 | 24.19 | 0.56
B                  | 0.390       | 40564 | 16.76 | 0.62
Nir                | 0.949       | 45185 | 11.01 | 0.60
C. Image Fusion Results
Results of the presented algorithm were compared with the Brovey, IHS and local regression fusion methods. Spectral images of size 400 × 400 pixels and restored panchromatic images of size 1600 × 1600 pixels were used. Fig. 2 presents (a) a fragment of a multispectral image, (b) the original panchromatic image, (c) the IHS fusion, and (d) the IHS fusion with the restored panchromatic image (the resolution is 4 times higher than the resolution of the multispectral image and 2 times higher than the resolution of the panchromatic image).

Figure 2. Fusion by different methods: a) a fragment of an original multispectral image (Landsat 7 ETM+); b) the panchromatic image; c) the IHS fusion with the original panchromatic image; d) the IHS fusion with the reconstructed panchromatic image.
TABLE II. NUMERICAL EVALUATION OF THE FUSION METHODS. THE ORIGINAL MULTISPECTRAL IMAGE IS OF SIZE 400 × 400 PIXELS, THE FUSED IMAGES ARE OF SIZE 1600 × 1600 PIXELS.

Method            | L2 histogram norm (R,G,B) | Correlation (R,G,B)    | RMSE (R,G,B)        | SEDI (R,G,B)             | Consumed time, s
Brovey fusion     | 31045, 32443, 42114       | 0.5294, 0.3384, 0.0189 | 7654, 12144, 12247  | 9.9783, 10.0945, 14.0574 | 16.171
IHS fusion        | 162152, 162610, 164330    | 0.1101, 0.0889, 0.1409 | 21578, 25116, 29445 | 16.3232, 9.91, 7.8131    | 13.844
Local regression  | 3328, 2643, 4258          | 0.9858, 0.9851, 0.9738 | 1151, 704, 744      | 1.9249, 1.1786, 1.2391   | 77.703
Global regression | 3372, 2785, 4604          | 0.9858, 0.9851, 0.9744 | 1157, 704, 739      | 1.9376, 1.1855, 1.2348   | 15.578
D. Visual Analysis
All the discussed algorithms increase the spatial resolution, but the Brovey and IHS fusion methods heavily distort the color composite. Visually there is no color distortion in the local and global regression fusion results, but the local regression fusion adds noise near edges and in homogeneous areas of the resulting image.
E. Quantitative Analysis
The Euclidean norm (L2) of the histogram difference between the original and the fused images, correlation, RMSE and the square error of the difference image (SEDI) were used in the quantitative analysis. The fused images were scaled down to the size of the original multispectral images by bilinear interpolation. The ideal value of all the measures is 0, except for correlation, for which it is 1. Table II presents the assessment of image fusion by the Brovey, IHS, local and global regression methods. The size of the original multispectral image is 400 × 400 pixels; the size of the fused images is 1600 × 1600 pixels. Fusion based on the local and global regression outperforms the widely used Brovey and IHS fusion methods. The calculation time of the global regression fusion is smaller than that of the Brovey fusion and comparable to that of the IHS fusion. Experiments were carried out in Matlab R14.
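For reference, a small Python sketch of the per-band measures that are fully specified above (the L2 norm of the histogram difference, the correlation coefficient and the RMSE) follows; it assumes the fused band has already been scaled down to the size of the original band and that the data are 8-bit with 256 histogram bins. SEDI and SSI are not sketched, since their exact formulas are given in the cited sources rather than here.

import numpy as np

def band_quality(original, fused_down, bins=256, value_range=(0, 255)):
    # L2 norm of the histogram difference between the original and fused bands.
    h_orig, _ = np.histogram(original, bins=bins, range=value_range)
    h_fused, _ = np.histogram(fused_down, bins=bins, range=value_range)
    l2_hist = np.linalg.norm(h_orig - h_fused)
    # Correlation coefficient between the two bands.
    corr = np.corrcoef(original.ravel(), fused_down.ravel())[0, 1]
    # Root mean squared error.
    rmse = np.sqrt(np.mean((original.astype(float) - fused_down.astype(float)) ** 2))
    return l2_hist, corr, rmse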
V. CONCLUSION

We presented a new technique for multispectral image fusion with minimal distortion of the color composite. The resolution of the fused image is higher than the resolution of the original panchromatic image. This was achieved in two stages: super-resolution reconstruction of a panchromatic image using the original multispectral and panchromatic images, followed by multispectral image fusion with the reconstructed panchromatic image. This allows the resolution of the image to be increased 2-4 times relative to the original panchromatic image resolution without loss of acutance and without distortion of the color composite.

ACKNOWLEDGMENT

The work was partly supported by INTAS project No. 06-1000024-9100.
REFERENCES
[1] Y. Zhang, “Problems in the fusion of commercial high resolution
satellite images as well as Landsat 7 images and initial solutions,” Joint
Int. Symposium on GeoSpatial Theory, Processing and Applications,
Ottawa, Canada, July 2002, vol. 34, part.4.
[2] T. Tu, S. Su, H. Shyu, and P. Huang “A new approach at IHS-like image
fusion methods,” Information Fusion, vol. 2, no. 3, pp. 177–186, 2001.
[3] Z. Wang, D. Ziou, C. Armenakis, D. Li, and Q. Li, “A comparative
analysis of image fusion methods,” IEEE Trans. Geoscience and Remote
Sensing, vol. 43, no. 6, pp. 1391–1402, 2005.
[4] T. Tu, Y. Lee, C. Chang, and P. Huang “Adjustable intensity-hue-
saturation and Brovey transform fusion technique for
IKONOS/Quickbird imagery,” Optical Engineering, vol. 44, no. 11, pp.
116201-1–116201-10, 2005.
[5] R. Kimmel, “Demosaicing: Image reconstruction from color CCD
samples,” IEEE Trans. Image Proc., vol. 8, no. 9, pp. 1221–1228, 1999.
[6] R. Molina, J. Mateos, and A. Katsaggelos “Super resolution
reconstruction of multispectral images,” VIRTUAL OBSERVATORY:
Plate Content Digitization, Archive Mining & Image Sequence
Processing, Heron Press, 2005.
[7] D. Dovnar and K. Predko, “The method for digital restoration of object
distorted by linear system,” Acta Polytechnica Scand. Applied Physics
Series, vol. 1, no 149, pp. 138-141, 1985.
[8] Z. Wang and A. Bovik “A universal image quality index,” IEEE Signal
Proc. Letters, vol. 9, no. 3, pp. 81-84, 2002.
[9] D. Kavaldiev and Z. Ninkov, “Influence of nonuniform charge–coupled
device pixel response on aperture photometry,” Optical Engineering.
vol. 40, no. 2, pp. 162–169, 2001.
[10] D. Dovnar and I. Zakharov “The orthogonalization method for error
compensation of Wiener filter for spatial discreditized images,” in Proc.
8th Int. Conf. on Pattern Recognition and Information Proc., Minsk,
Belarus, May 2005, pp. 173–176.
[11] R. Molina, J. Mateos, and A. Katsaggelos “Blind deconvolution using a
variational approach to parameter, image, and blur estimation,” IEEE
Trans. on Image Processing, vol. 15, no. 12, pp. 3715-3727, 2006.
[12] J. Hill, C. Diemer, O. Stover, and T. Udelhoven. “A local correlation
approach for the fusion of remote sensing data with different spatial
resolution in forestry applications,” in Proc. of Int. Archives of
Photogrammetry and Remote Sensing, Valladolid, Spain, June 1999,
vol. 32, no. Part 7-4-3 W6, pp. 167–174.