Republic of Iraq
Ministry of Higher Education
and Scientific Research
University of Baghdad / College of Science
Department of Physics
Remote Sensing Optical Images Applications in
Vegetation Monitoring and Mapping, Part of
Baghdad
A Thesis
Submitted To the College of Science,University of Baghdad,
In Partial Fulfillment of the Requirements for the Degree Master of
Science of In Physics / Remote Sensing and Image Processing
By
Abeer Nazar Abdul-Hameed
B.Sc. 2006
Supervised by
Asst. Prof. Dr. Alaa S. Mahdi
2021 A.D. 1442 A.H.
Acknowledgments
Praise be to Allah Almighty, who gave me patience and
determination to complete this work.
I would like to extend my sincere thanks and appreciation to
Assist. Prof. Dr. Alaa S. Mahdi for his scientific follow-up,
patience and guidance for writing this thesis.
I would also like to thank the Dean of the College of Science, the
Head of the Physics Department, and the Head and Associates of
the Remote Sensing Unit, College of Science, University of
Baghdad.
Special thanks are given to Prof. Dr. Ghaith F. Nehme and
Assist. Prof. Dr. Fouad K. Mashee.
Finally, I thank my father, my mother, my brothers, my husband,
my children, my friends, and everyone who supported me to
accomplish this work.
Abeer
Abstract
Vegetation monitoring is an important remote sensing application because of
the variation in vegetation types and their distribution around the world. In this
study, vegetation in Baghdad city was monitored using the Normalized
Difference Vegetation Index (NDVI) applied to a temporal series of Landsat
satellite images (Landsat-5 Thematic Mapper (TM) and Landsat-8 Operational
Land Imager (OLI)). The results were evaluated using the ENVI (Environment
for Visualizing Images) version 4.8 remote sensing and image processing
package, together with subroutines written for particular work steps. The study
shows that the NDVI method is a suitable and useful way to monitor vegetation.
Vegetated areas were calculated after the images were classified and extracted
using supervised classification techniques (the minimum distance classifier).
Different change detection techniques were investigated for the thermal band
over selected periods, including image differencing, image ratioing, and
Principal Component Analysis (PCA). The mean of bands 10 and 11 of the TIR
(2015 and 2017) was computed using a simple program written in Visual Basic
6.0. The results show that the vegetated area in the region of interest decreased
from 43 km2 in 2000 to 9 km2 in 2015, after which it increased to 22 km2. This
matches the rainfall record: rainfall decreased from 34 mm to 11 mm between
2000 and 2015 and rose to 31 mm in 2017. The image differencing and ratioing
methods show the changes in the vegetated areas as dark-tone pixels, and their
results agree best with the NDVI results. On the other hand, the PCA change
detection method performs poorly with the thermal bands due to the low
correlation between the merged thermal bands.
List of Contents

Contents ..... Page No.
Abstract ..... I
List of Contents ..... III
List of Figures ..... V
List of Tables ..... VII
List of Abbreviations ..... VIII

Chapter One: General Introduction
1.1 Preface ..... 1
1.2 Overview of Remote Sensing ..... 3
1.3 Radiation and Electromagnetic Spectrum ..... 4
1.4 Vegetation Monitoring by Thermal Bands ..... 5
1.5 The Types of Remote Sensing System ..... 8
1.5.1 Passive Sensing and Sensors ..... 9
1.5.2 Active Sensing and Sensors ..... 9
1.6 Some Types of Sensors and Their Characteristics ..... 10
1.6.1 Optical Sensors ..... 10
1.6.2 Thermal Infrared Sensors ..... 10
1.6.3 Microwave Sensors ..... 11
1.7 The Aims of Study ..... 12
1.8 The Region of Interest and Available Data ..... 12
1.9 The Literature Survey ..... 15
1.10 Thesis Layout ..... 18

Chapter Two: Theoretical Background
2.1 Introduction ..... 19
2.2 Thermal Sensing Concepts and Spectrum ..... 20
2.3 The Operational Land Imager ..... 23
2.4 The Normalized Difference Vegetation Index ..... 26
2.5 Thermal Properties of Vegetation ..... 28
2.6 The Digital Image ..... 29
2.7 Digital Change Detection ..... 30
2.7.1 The Digital Change Detection Conditions ..... 31
2.7.2 The Thresholding ..... 32
2.7.3 Image Ratioing ..... 32
2.8 Principal Component Analysis as Change Detection ..... 33
2.9 The Image Classification ..... 35
2.9.1 Supervised Classification ..... 35
2.9.1.1 Maximum Likelihood Classification ..... 37
2.9.1.2 Minimum Distance Classification ..... 39
2.9.2 Unsupervised Classification ..... 42
2.9.2.1 Isodata Classification Method ..... 42
2.9.2.2 K-Means Classification Method ..... 43
2.10 Statistical Digital Image ..... 43
2.11 Proposed Statistical Model ..... 44

Chapter Three: Results and Discussion
3.1 Diagram of the Work Procedure ..... 45
3.2 The Available Data ..... 46
3.3 The Satellite Image Subset ..... 48
3.4 Normalized Vegetation Index (NDVI) ..... 53
3.5 Classification of the NDVI Image ..... 58
3.6 The Climatic and Weathering Factors ..... 62
3.7 The Digital Change Detection for Thermal Bands ..... 65
3.7.1 Image Differencing ..... 69
3.7.2 Image Ratioing ..... 75
3.7.3 Principal Components Analysis for Change Detection ..... 81
3.7.3.1 Mathematical Formulation ..... 82
3.7.3.2 The PCA Transform Application ..... 85
3.8 Results Discussion and Analysis ..... 91

Chapter Four: Conclusions and Recommendations
4.1 Conclusions ..... 93
4.2 Recommendations for Future Works ..... 94
References ..... 95
List of Figures

(1.1) The Electromagnetic Waves Propagation
(1.2) The Electromagnetic Spectrum Regions
(1.3) The Spectral Reflectance of Vegetation
(1.4) Passive and Active Remote Sensing System
(1.5) Relationship Between Energy and Wavelength
(1.6) (a) Map of Study Area and (b) Image of the Landsat-8
(2.1) Diagram of Factors Controlling Radiant Temperature
(2.2) The Relationship of the Intensity of Black Body Radiation with the Wavelength
(2.3) The OLI Sensor on Board Landsat 8
(2.4) Structure of a Digital Image and Multispectral Image
(2.5) The Supervised Classification Steps
(2.6) Two-Dimensional Multispectral Space with Gaussian Probability Distributions Representing the Spectral Groups
(2.7) The Supervised Parallelepiped Classification Process, Performed on Multispectral Bands (i.e., Red Band 3 versus Near-Infrared Band 4)
(2.8) The Unsupervised Classification Technique
(2.9) A Simple Statistical Model for Image Processing in Order to Optimize Its Functionality
(3.1) Diagram of the Work Procedure
(3.2) A: A Full Scene of the Landsat Series, for 2000, 2010, 2015 and 2017; B: The Composite of the Study Area
(3.3) A: The Composite of b3, b4 and b5 Image, 2000; B: The Composite of b3, b4 and b5 Image, 2010; C: The Composite of b3, b5 and b7 Image, 2015; D: The Composite of b3, b5 and b7 Image, 2017
(3.4) The NDVI Images: A: 2000; B: 2010; C: 2015; D: 2017
(3.5) The NDVI Images Classified by Supervised Classification (Minimum Distance Classifier): A: 2000; B: 2010; C: 2015; D: 2017
(3.6) The Average Monthly Temperature (°C) from May to October of the Study Area During the Years 1998-2018
(3.7) The Average Monthly Rainfall (mm) from November to April of the Study Area During the Years 1998-2018
(3.8) The Thermal Bands: A: TM5, 2000; B: TM5, 2010; C: TIR, 2015; D: TIR, 2017
(3.9) Thermal-Band Difference Images: A: 2017 minus 2015, Threshold 70; B: 2017 minus 2010, Threshold 85; C: 2017 minus 2000, Threshold 77; D: 2010 minus 2000, Threshold 77; E: 2015 minus 2000, Threshold 77
(3.10) Thermal-Band Ratio Images: A: 2017 over 2010; B: 2010 over 2000; C: 2017 over 2000; D: 2015 over 2000
(3.11) The Eigenvalues of the Transform
(3.12) A: The First Principal Component, No Change; B: The Second Principal Component, Change Image; C: The Third Principal Component, Noise Image; D: The Fourth Principal Component, Noise Image
List of Tables

(1.1) Main Features of the Landsat-8 (OLI and TIRS) and Landsat-5 (TM) ..... 2
(1.2) The Landsat Scenes Details ..... 13
(3.1) Source of Image, Dates, Composite Bands and Resolution (Meters) of Captured Images of Baghdad City ..... 47
(3.2) Information of Landsat Bands ..... 53
(3.3) The Vegetated Areas Calculated from the NDVI Images ..... 58
(3.4) Average Temperature and Average Rainfall of Baghdad City ..... 63
List of Abbreviations

TM: Thematic Mapper
OLI: Operational Land Imager
TIRS: Thermal Infrared Sensor
RS: Remote Sensing
RISAT: Radar Imaging Satellite
EM: Electromagnetic
VIs: Vegetation Indices
NDVI: Normalized Difference Vegetation Index
NIR: Near-Infrared Band
RED: Red Band
DN: Digital Number
BV: Brightness Value
IR: Image Ratio
EMR: Electromagnetic Radiation
PCA: Principal Component Analysis
ROI: Region of Interest
TH band: Thermal Band
ENVI: Environment for Visualizing Images
USGS: United States Geological Survey
UTM: Universal Transverse Mercator
WGS: World Geodetic System
Chapter One
General Introduction
1.1. Preface
Formal monitoring of land and ocean surfaces is critical for understanding
Earth's processes and rhythms. This monitoring has been carried out
continuously for many years, and new observation programs of this kind are
developed every year. The Landsat satellites have proven to be among the most
effective and reliable: information about natural and human-induced changes is
now available at moderate resolution, on a large scale, and continuing into the
present, where anomalies can be observed (USGS, 2013). The Landsat system
of satellites has been a premier resource for mapping and tracking land cover,
biophysical, and geophysical properties over the last four decades. Of the last
three Landsats, only Landsat 7 encountered an issue that produced missing data
after it entered orbit; the rest operated on schedule [1,2].
The Landsat-5 satellite was launched on March 1, 1984, from Vandenberg
Air Force Base in California, carrying instruments similar to those on the
Landsat-4 satellite: the Multispectral Scanner (MSS) and the Thematic Mapper
(TM). It orbited the Earth for nearly 29 years, earning a Guinness World Record
as the 'longest-operating Earth observation satellite', and was withdrawn from
service in 2013. Landsat 7 was launched from Vandenberg Air Force Base in
California on April 15, 1999, on a Delta II rocket. It has a 16-day repeat cycle
and carries the Enhanced Thematic Mapper Plus (ETM+) sensor, an improved
version of the Thematic Mapper instruments onboard Landsat 4 and Landsat 5;
Landsat 7 products are delivered as 8-bit images with 256 grey levels [3]. The
Landsat-8 satellite (formerly known as the Landsat Data Continuity Mission,
LDCM) was launched from Vandenberg, California, on an Atlas-V rocket on
February 11, 2013. It is the most recently launched Landsat satellite, carrying
two instruments: the Thermal Infrared Sensor (TIRS) and the Operational Land
Imager (OLI). Table (1.1) shows the main features of the Landsat-8 (OLI and
TIRS) image bands and the Landsat-5 (TM) image bands.
Table (1.1), Main features of the Landsat-8 (OLI and TIRS) and Landsat-5 (TM)

Landsat-5 TM:
Band (No./Name)                   Wavelength (μm)   Resolution (m)
1 / Blue                          0.441-0.514       30
2 / Green                         0.519-0.601       30
3 / Red                           0.631-0.692       30
4 / Near Infrared (NIR)           0.772-0.898       30
5 / Short-wave Infrared (SWIR1)   1.547-1.749       30
6 / Thermal Infrared (TIR)        10.40-12.50       120
7 / Short-wave Infrared (SWIR2)   2.064-2.345       30

Landsat-8 OLI and TIRS:
Band (No./Name)                   Wavelength (μm)   Resolution (m)
1 / Coastal-Aerosol               0.435-0.451       30
2 / Blue                          0.452-0.512       30
3 / Green                         0.533-0.590       30
4 / Red                           0.636-0.673       30
5 / Near Infrared (NIR)           0.851-0.879       30
6 / Short-wave Infrared (SWIR1)   1.566-1.651       30
7 / Short-wave Infrared (SWIR2)   2.107-2.294       30
8 / Panchromatic                  0.503-0.676       15
9 / Cirrus                        1.363-1.384       30
10 / Thermal Infrared (TIR)       10.60-11.19       100/60
11 / Thermal Infrared (TIR)       11.50-12.51       100/60
1.2. Overview of Remote Sensing
Remote sensing is the science and art of acquiring information about an
entity, area, or phenomenon by analyzing data obtained with a device that is not
in contact with the object, area, or phenomenon being examined. You use
remote sensing when you read these sentences: your eyes serve as sensors that
respond to the light this page reflects. The "information" your eyes acquire
consists of impulses corresponding to the amount of light reflected from the
dark and light areas of the page. These data are processed, or interpreted, in
your brain to resolve the dark areas on the page into letters; the letters form
words, and the meaning the words express is interpreted in turn. Remote
sensing can therefore be considered a reading process in many ways. We can
collect information remotely using different sensors, and the data can be
analyzed to obtain information about the objects, areas, or phenomena under
study. The remotely collected data can be of several types, including variations
in force distributions, acoustic wave distributions, or electromagnetic energy
distributions [4].
Some implementations of remote sensing technology:
- Urban planning and environmental pollution (for example, expansion of the metropolis and toxic waste).
- Detection and surveillance of global change, such as ozone loss in the atmosphere and deforestation.
- Farming (the state of the crops' health, yield forecast).
- Search for nonrenewable resources, such as minerals, natural gas, etc.
- Renewable natural energy resources (e.g., wind energy, hydro energy, and solar energy).
- Military monitoring and reconnaissance, such as strategic planning, tactical evaluation, etc. [5].
1.3. Radiation and Electromagnetic Spectrum
Electromagnetic radiation is a form of energy that travels in the form of
wave motion. It has two main components, the electric field (E) and the
magnetic field (M), which are perpendicular to each other and to the direction
of propagation [6], as shown in Fig. (1.1).
Figure (1.1): The Electromagnetic Waves Propagation, [7].
The following formula expresses the relationship between the parameters of the
wave motion:

    c = λ ν   ………. (1-1)

where: c is the speed of light, which equals 3 × 10⁸ m/s, ν is the wave frequency
(Hz), and λ is the wavelength (nm).
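As a quick numerical check of relation (1-1), the frequency can be computed from the wavelength. The following sketch is illustrative only; the function name and the 650 nm red-light example are not part of the thesis:

```python
# Numerical check of relation (1-1): c = lambda * nu.
C = 3.0e8  # speed of light, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Wave frequency (Hz) for a given wavelength (m), from c = lambda * nu."""
    return C / wavelength_m

print(frequency_hz(650e-9))  # red light at 650 nm -> about 4.6e14 Hz
```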
The electromagnetic spectrum (EM) is a continuous range of electromagnetic
radiation, ranging from gamma rays (highest frequency and shortest
wavelength) to radio waves (lowest frequency and longest wavelength) and
including visible light, as presented in Fig. (1.2).
Figure (1.2): The Electromagnetic Spectrum Regions,[8].
1.4. Vegetation Monitoring by Thermal Bands
Vegetation and the landscape are mutually dependent: vegetation must
accommodate both man-made conditions and the climate. The most efficient
and environmentally sound approach is to use as much of the available heat as
possible, particularly in regions that experience colder temperatures, whereas in
humid environments it is best to use modifications that create cooling. An arid
region does not have adequate water available unless new sources of moisture
are found or moisture is added by natural means [6].
Trees have a great cooling effect on cities through both shading and
evaporation. As part of their normal physiology, plants transpire, releasing
water from their leaves much as people perspire. When this water evaporates, it
draws heat from the air, making the surrounding environment cooler. A single
mature, adequately watered tree can transpire on the order of 40 gallons of
water per day, a cooling effect roughly equivalent to shutting off an electric
heater for four hours. Practiced in hot and dry climates, this can lower the
ambient temperature by 3 to 6 °C in open country compared with areas without
trees; in more developed environments the effect is smaller, and its significance
and desirability change with the moisture and temperature of the
surroundings [8].
Trees also have a considerable impact on local weather patterns,
particularly on how much and when precipitation falls, by holding back and
releasing moisture, and on how warm or cold it is during the warmer and colder
seasons. Trees that are well sized, well located, and appropriate to the site will
yield the most benefits. Two strategies help ensure maximal benefits:
1- Deciduous trees planted to the south and west of a building partially shade it
in summer while admitting full sun in winter. These trees help cool urban areas
and building surfaces during hot, sunny weather but allow a continual flow of
sunlight when it is cool. Trees and shrubs located to the northwest help protect
buildings from harsh cold winter winds and snow accumulation.
2- Trees grouped next to one another form a park-like setting that offers shade
and coolness to local residents. Grouped trees have a better chance of reaching
maturity and living longer because they shield each other from the wind and the
sun, particularly in sparse, rugged habitats. Several studies of large parks and
wetlands have noted their temperature benefits in connection with the reduction
of urban heat islands in populated regions and their immediate environs; the
vicinity of such green areas is believed to be cooler than the built-up regions by
approximately (2 to 3) °C in air temperature [9].
Traditional methods of monitoring vegetation cover are practical only for
small areas, being labor-intensive and time-consuming. Remote sensing
provides more detailed data than conventional approaches: it offers continuous
coverage over a wide area and functions as a continuous measurement. The
spectral properties of vegetation depend on the leaves (especially their
arrangement), on cell and tissue structure and makeup, and on the amount of
water in the leaves. In the visible region (between 0.4 and 0.7 micrometres), the
reflectance of green plant leaves is controlled by photosynthetic and accessory
pigments such as chlorophylls and anthocyanins, whereas in the near-infrared
(0.7 to 1.1 μm), where leaves are highly reflective, it is determined mainly by
leaf structure in the mesophyll zone. This clear separation of plant reflectance
across the visible and near-infrared portions of the electromagnetic spectrum
assists surveying techniques [10].
The Spectral Reflectance of Vegetation
Vegetation can be divided into forests, trees, and fields, and it has a unique
spectral signature; the reflective characteristics vary according to vegetation
type. By its distinct spectral signature, vegetation can normally be identified as
healthy or unhealthy. The spectral reflectance of vegetation is affected by leaf
pigment, cell structure, and water content. Deciduous trees, which have a higher
reflectance in the near-infrared, are an example. Fig. (1.3) shows the spectral
reflectance of vegetation.
Visible reflectance
The leaf pigments mainly absorb sunlight in the visible region of the spectrum.
The energy is absorbed strongly by chlorophyll in the blue and red
wavelengths, while more of the green wavelengths are reflected, so healthy
vegetation appears green.
Near-Infrared reflectance
Due to the cellular structure of the leaves, especially the spongy mesophyll,
healthy vegetation has much higher reflectance in the near-infrared (NIR) region
than in the visible region. Thus, healthy vegetation is easily identifiable by its
high NIR reflectance and low reflectance in the visible region.
Shortwave Infrared reflectance
In the short-wave infrared wavelengths, the reflectance is governed by the
structure of the vegetation and its water content; water absorption is strongest at
wavelengths of 1.45, 1.95, and 2.50 μm. The reflectance in this region therefore
generally increases as the water content of the leaves is reduced [9,10].
Figure (1.3): The spectral reflectance of vegetation (the chlorophyll pigment
absorbs the blue (0.45 μm) and red (0.65 μm) parts and reflects the green
(0.55 μm)), [13].
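The red/near-infrared contrast described above is exactly what the NDVI exploits. A minimal sketch of the standard index, using illustrative reflectance values rather than measured ones:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Healthy vegetation: high NIR, low red reflectance -> strongly positive NDVI.
print(ndvi(0.50, 0.08))
# Bare soil: similar NIR and red reflectance -> NDVI near zero.
print(ndvi(0.30, 0.25))
```

The same formula, applied pixel by pixel to the red and NIR bands, produces the NDVI images discussed later in the thesis.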
1.5. The Types of Remote Sensing System
Remote sensing is the instrumentation technique and method used to
observe the Earth's surface at a distance and interpret the images or numerical
values obtained in order to obtain meaningful information about particular
objects on Earth. There are two kinds of remote sensing [14].
1.5.1. Passive Sensing and Sensors
The sun is the ultimate energy source for life on the earth, and the same is true
for remote sensing, as shown in Fig. (1.4). Solar energy is either reflected as it
is for VIS-NIR wavelengths or absorbed and then re-emitted for thermal IR
wavelengths. Passive remote sensing devices measure energy that is naturally
available and are therefore called passive sensors. Passive sensors can detect
reflected solar energy only while the sun is illuminating the Earth's surface; at
night there is no reflected solar energy. However, naturally emitted energy,
such as thermal IR, can be sensed day and night. Landsat, IRS, the SPOT
series, IKONOS, Quickbird, etc., are examples of passive sensors [10].
1.5.2. Active Sensing and Sensors
Active sensors have their own source of energy for illuminating the features to
be examined, as shown in Fig. (1.4). The active sensor records the energy
reflected by the target feature. One of the key advantages of active sensors is
therefore their ability to record energy at any time, irrespective of the time of
day or season. Flash photography, for example, is active remote sensing, in
contrast to available-light photography, which is passive. Active sensors can
also be constructed to detect wavelengths that are not adequately supplied by
the sun at the Earth's surface, such as microwaves, and they allow control over
the way the target is illuminated while it is being sensed. Examples of active
sensors include Synthetic Aperture Radar (SAR) and LiDAR [10].
Figure (1.4): Passive and Active Remote Sensing System,[16].
1.6. Some Types of Sensors and Their Characteristics
Many sensors and instruments are used in remote sensing, with different
spectral, spatial, and temporal resolutions. The following are some types of
sensors, grouped by various classification criteria.
1.6.1. Optical Sensors
Optical infrared remote sensors record reflected or emitted radiation in the
visible, near-, middle-, and far-infrared regions of the electromagnetic
spectrum. They can observe wavelengths extending from 400 to 2000 nm. The
sun is the energy source for optical remote sensing [17].
1.6.2. Thermal Infrared Sensors
Thermal imagers are typically across-track scanners that detect radiation
emitted in only the thermal portion of the spectrum. Thermal remote sensing is
passive, since it senses radiation that is naturally emitted. To prevent the
detectors from registering their own heat, they are cooled to temperatures as
near to absolute zero as possible. The primary purpose of a thermal sensor is to
estimate the surface temperature and the other thermal properties of its target.
Radiation is sensed within the (3 to 14) μm wavelengths of the electromagnetic
spectrum; Fig. (1.5) shows the relationship between energy and wavelength.
Thermal sensing is used to measure soil and ocean temperatures and humidity
and the Earth's radiation balance. Because researchers rely strongly on the
thermally accurate measurements of Landsat-5, 7, and 8 to monitor how land
and water are being used, TIRS became a component of the Landsat
program [18]. Thermal remote sensing measures an object's radiant
temperature, which is an indicator of the radiation the object emits; it is often
also called the brightness or "apparent temperature" of the object.
Electromagnetic radiation emanates from all objects with a temperature
above absolute zero (0 K). Kinetic temperature, or heat, is produced by the
random vibration of the particles in all matter, and it can be measured with a
thermometer on standard scales (°F, °C, Kelvin). The radiant temperature of an
object is closely related to its kinetic temperature, so radiometric instruments
such as radiometers can be used to estimate temperature from emitted
energy [18].
Figure (1.5): Relationship Between Energy and Wavelength.[19]
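The radiant (brightness) temperature described above is what is derived from the Landsat thermal bands. A minimal sketch of the standard USGS two-step conversion (digital number to top-of-atmosphere radiance, then to brightness temperature); the calibration constants shown are typical Landsat-8 band 10 values from an MTL metadata file, used here only for illustration, and should always be read from the scene's own metadata:

```python
import math

# Typical Landsat-8 band 10 calibration constants (illustrative; take the
# real values from the scene's MTL file).
M_L = 3.342e-4   # RADIANCE_MULT_BAND_10
A_L = 0.1        # RADIANCE_ADD_BAND_10
K1 = 774.8853    # K1_CONSTANT_BAND_10
K2 = 1321.0789   # K2_CONSTANT_BAND_10

def brightness_temperature_k(dn: float) -> float:
    """At-sensor brightness (radiant) temperature in Kelvin from a band-10 DN."""
    radiance = M_L * dn + A_L              # DN -> TOA spectral radiance
    return K2 / math.log(K1 / radiance + 1.0)

print(brightness_temperature_k(25000))     # roughly 292 K for this example DN
```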
1.6.3. Microwave Sensors
These sensors receive microwaves, which have longer wavelengths than
visible light and infrared rays, so observation is not affected by day, night, or
weather. The microwave portion of the spectrum contains wavelengths in the
approximate range of 1 mm to 1 m; the longest microwaves are around
2,500,000 times longer than the shortest light waves. Microwave sensing is of
two types [13,14]:
a) Active sensors: the sensor emits microwaves and detects the microwaves
reflected by the land surface features [20]. They are used to analyze the
conditions of mountains, valleys, ocean floors, waves, and ice [22].
b) Passive sensors: this type of sensor tracks microwaves radiated naturally
from the Earth's surface features. It is used to observe sea surface temperature,
snow formation, ice thickness, soil humidity, hydrological applications, etc.
RISAT is a remote sensing satellite from India that provides microwave
data [20].
1.7. The Aims of Study
This study aims to monitor the growth and health of vegetated areas using
satellite imagery in the visible and thermal bands, based on Landsat data for
Baghdad city as the region of interest.
1.8. The Region of Interest and Available Data
The area studied in this work is Baghdad, the capital of Iraq, as shown in
Fig. (1.6). It is located at (Lat. 33° 13′ 32″ N, Lon. 44° 28′ 19″ E) and covers an
area of 204.2 km2. Baghdad city is divided into two residential parts: the first
lies on the east side and is named Rasafa; the second lies on the west side and is
named Karkh. The Tigris river separates the two parts, so the city lies on the
banks of this river [23]. The average elevation of Baghdad city is 36 m above
sea level, according to the Digital Elevation Model (DEM). Baghdad's
population is about 8,126,755, with a population density of 49,019, according
to the 2018 census [24]. The city is characterized by an abundance of palm
trees, the most widespread tree in the area. Baghdad's climate is hot and dry in
summer and cold and humid in winter, owing to the low and flat topography of
the city. The images of Baghdad city were captured by Landsat-5 and
Landsat-8, as shown in Fig. (1.6). The available data for this project are four
full-band Landsat scenes; their details are shown in Table (1.2).
(a)
(b)
Figure (1.6): (a) Map of study area and (b) Image of the lansat-8.
Table (1.2): The Landsat Scenes Details.

Sensor         Bands   Date   Path   Row   Resolution (m)   Scene ID
Landsat5 TM    7       2000   168    37    30               LT51680372000364RSA00
Landsat5 TM    7       2010   168    37    30               LT51680372010343MTI00
Landsat8 OLI   11      2015   168    37    30               LC81680372015341LGN01
Landsat8 OLI   11      2017   168    37    30               LC81680372017346LGN00
All scene bands were available georeferenced to the UTM projection with 30 m
spatial resolution, including the thermal bands. The feature-matching accuracy
between scenes and bands is better than half a pixel; this appears clearly in the
image subsetting process, which is built on the UTM coordinate system.
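The scene IDs in Table (1.2) follow the pre-collection Landsat naming convention (LXSPPPRRRYYYYDDDGSIVV: sensor, satellite number, WRS path/row, year, day of year, ground station, version), so the path, row, and acquisition date can be recovered directly from the ID. A small illustrative parser (not part of the thesis workflow):

```python
def parse_landsat_scene_id(scene_id: str) -> dict:
    """Split a pre-collection Landsat scene ID (LXSPPPRRRYYYYDDDGSIVV)."""
    return {
        "sensor": scene_id[1],           # T = TM, C = OLI/TIRS combined
        "satellite": int(scene_id[2]),
        "path": int(scene_id[3:6]),
        "row": int(scene_id[6:9]),
        "year": int(scene_id[9:13]),
        "day_of_year": int(scene_id[13:16]),
        "ground_station": scene_id[16:19],
        "version": scene_id[19:21],
    }

info = parse_landsat_scene_id("LT51680372000364RSA00")
print(info["satellite"], info["path"], info["row"], info["year"])  # 5 168 37 2000
```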
1.9. The Literature Survey
A. Manickavasagan et al. (2005) [25] reviewed the use of thermal
imaging in numerous agricultural pre-harvest and post-harvest operations. Most
applications remain at the experimental stage, however, and intensive research
is needed to improve efficiency, and ultimately net profit for farmers, in real-
time applications. Several researchers have studied the plant, soil, and water
relationship in detail through thermal screening; the results of these research
projects provide useful data for site-specific management and precision
cultivation. Thermal imaging approaches can also be used for classification in
post-harvest operations.
Jose A. J. Berni et al. (2009) [26] demonstrated the ability to generate
quantitative remote sensing products by means of a helicopter-based UAV
equipped with inexpensive thermal and narrowband multispectral imaging
sensors. During the summer of 2007, the platform was flown over agricultural
fields, including a commercial peach orchard near Cordoba (Spain), to assess
whether water stress levels could be detected, obtaining thermal imagery
(40-cm resolution) and narrowband multispectral imagery (20-cm resolution).
Surface reflectance and temperature imagery were obtained after atmospheric
correction with MODTRAN, and biophysical parameters were estimated using
indices such as the normalized difference vegetation index. The paper shows
that results obtained with a low-cost UAV system for agricultural applications
yielded estimations comparable to, if not better than, those obtained by
traditional manned airborne sensors.
Beatriz Ribeiro et al. (2010), [27], employed imagery with high spatial and
spectral resolution (8.0-13 μm) from the SEBASS airborne sensor to map tree
canopy characteristics at the State Arboretum of Virginia, near Boyce, on
July 6, 2007, at a spatial resolution of approximately 1 m² per pixel. Fifty
species were examined, and approximately half were found to have spectral
fingerprints that matched their target signatures with varying degrees of
success. The emissivity spectra extracted from the SEBASS measurements
correlated favorably with species spectra. Some of the best results were
obtained for species with high-spectral-contrast leaves, closely planted
vegetation, and broad, flat, wide canopies. Tree species with small leaves
and/or an unfavorable leaf orientation possibly exhibited cavity blackbody
absorption. Increased spatial resolution and better image calibration could
help improve thermal-infrared species identification. This study shows that
hyperspectral imaging can be used to distinguish various plant species from a
distance. Prerequisites for good results include using only measurements
taken from the data, verifying that they accurately represent chemical and
species (emissivity) differences within the study area, and using laboratory
spectra to find these emissivity signatures.
Fouad K. et al. (2012), [28], studied the vegetation of two different years
(1990 and 2002). The study area is in Iraq, west of Baghdad, between (37.36)
and (36.87) northwards and between (41.10) and (39.50) eastwards. Two
satellite scenes acquired at different times were used: Landsat-5 Thematic
Mapper (TM) from 1990 and Landsat-7 Enhanced Thematic Mapper Plus (ETM+)
from 2002. Referring to the statistical values of the classification process,
the results indicated that vegetation covered 42.8% of the study area in
1990, rising to 52.5% for the same area in 2002.
S. Ullah (2013), [7], studied the potential of mid-wave and thermal infrared
spectroscopy for identifying plant species and estimating leaf water content.
The study was conducted in Enschede, the Netherlands, between July and
September 2010. Leaves were collected from a total of thirteen plant species,
eleven of which were local and two tropical. It was concluded that laboratory
emissivity spectra of different vegetation species contain sufficient
information to discriminate vegetation types and have potential for floristic
mapping. The results showed that vegetation types have characteristic
emissivity signatures that are statistically significantly different from
those of other species; with the advancement of hyperspectral MIR and TIR
sensors, the approach can be implemented in the field and at the airborne
level.
M. Pandya et al. (2013), [29], recorded radiance and emissivity measurements
of eight plant species (maize, pearl millet, sorghum, tobacco, groundnut,
cotton, and soybean) at a farm in Anand, India, in the TIR spectral range of
around 4-12 μm. Emission spectral features of leaf constituents such as
cellulose and xylose were observed in the field. Surface structural
characteristics such as trichomes and texture can influence emissivity. The
emissivity spectral measurements were accompanied by corresponding
biochemical analyses. Preliminary field observations reveal that there might
be valuable spectral information that can be extracted but is difficult to
see by direct sensing.
Ishimwe et al. (2014), [30], noted that in recent years thermal imaging has
risen greatly in importance across various fields of agriculture, from
germination, irrigation, salinity, plant disease, and fruit/vegetable yields
to food evaluation and growth measurement, among others. The technique offers
fine temporal and spatial granularity in agriculture. However, intensive
research is still required for the agricultural processes that remain
unexamined (e.g., crop yield forecasting), despite its applicability to
various agronomic operations in the pre-harvest and post-harvest seasons.
E. Neinavaz (2017), [19], retrieved the Leaf Area Index (LAI) from thermal
satellite imagery over a mixed temperate forest. The satellite data were
acquired on 9 August 2015 from Landsat-8 for the mixed mountain forest of
Bavarian Forest National Park (BFNP), located in the southeastern part of
Germany. Bands 10 and 11 in the TIR region were acquired at 100 m resolution
and resampled to 30 m to match the OLI spectral bands. The results
demonstrate that TIR satellite data can achieve an accuracy for LAI retrieval
comparable to VNIR/SWIR measurements and that the combination of reflectance
and emissivity data increases the retrieval accuracy of LAI. This finding has
implications for the retrieval of other vegetation parameters using TIR
satellite remote sensing.
Y. C. Bukheet et al. (2018), [31], studied the land cover changes of Baghdad
city over a period of 30 years using multi-temporal Landsat satellite images
(TM, ETM+, and OLI) acquired in 1984, 2000, and 2015, respectively. The
methodology considered different image pre-processing techniques, including
geometric correction, radiometric correction, atmospheric correction, and
satellite image clipping. The principal component analysis transform was used
(as an enhancer, compressor, and temporal change detector), together with a
supervised classification method (the maximum likelihood classifier
technique) in ENVI 5.1. The results show that the first PCA layer is
responsible for more than 95% of the total variation in each satellite image
(95.8387%, 96.2976%, and 99.1282% of the total variance for the 1984, 2000,
and 2015 Landsat images, respectively). Therefore, the first principal
component is suitable for classifying the multiband satellite images because
it contains most of the ground information in the study area and can be used
to compare changes in land cover classes across different years.
M. F. Allawai (2020), [32], assessed changes in land cover within Mosul City
during the period 2014-2018 using Landsat-8 satellite images. Land cover
change was identified using Normalized Difference Vegetation Index (NDVI)
techniques and classification of the satellite images with the maximum
likelihood classifier over the period 2014-2018, together with the Green
Normalized Difference Vegetation Index. The highest NDVI value was found in
2015 (6.26%), denoting the presence of moderate to high vegetation cover at
that time. After 2015, the highest NDVI value followed a decreasing trend
(6.05% in 2016 and 5.96% in 2018), which clearly represents the vegetation
cover change in the study area.
1.10 Thesis Layout
The thesis comprises the following four chapters:
- Chapter One: presents a general introduction to remote sensing and
vegetation monitoring using thermal sensors.
- Chapter Two: presents the theoretical background and the specific methods
relevant to the topic of the study.
- Chapter Three: presents and discusses the results.
- Chapter Four: presents the most important conclusions, as well as the most
important proposals for future work.
Chapter Two
The Theoretical Background
2.1. Introduction
Remote sensing has advanced from an experimental to an operational
level over the previous few decades. The number of earth observation satellites
has expanded substantially, as has the sophistication of tools and processing
techniques, as well as the application of data to novel purposes. However,
historically, most efforts have focused on optical data and, more recently,
on microwave data. The existing literature shows how little use the science
and application communities have made of data in the thermal infrared region.
Thermal data are used infrequently for a variety of reasons, including sensor
capability constraints, limited data availability, and a general reluctance
to investigate the potential of thermal remote sensing. This chapter
discusses the fundamentals and challenges of thermal remote sensing and
illustrates how thermal data can be used in vegetation applications. Thermal
data are explored in terms of their advantages and disadvantages, as well as
the possibilities for thermal remote sensing, particularly in connection with
future high-resolution satellites, [33].
All natural features reflect and emit radiation. Earth's thermal radiations are far
more intense than solar reflected radiations in the TIR area of the
electromagnetic spectrum. Hence, sensors operating in this wavelength region
detect mainly the soil's thermal radiative properties. However, hot features also
emit large amounts of radiation at shorter wavelengths. For high-temperature
phenomena, the thermal remote sensing realm therefore encompasses the TIR,
the shortwave infrared, the near-infrared, and, in some cases, the visible
region of the electromagnetic spectrum.
Thermal remote sensing is fundamentally different from optical and microwave
remote sensing. Thermal data are used to complement other types of remote
sensing data in practice. Thus, thermal remote sensing holds promise for a
variety of applications that have not yet been thoroughly explored, [33].
2.2. Thermal Sensing Concepts and Spectrum
Thermal remote sensing measures the radiation emitted by ground features in
order to estimate the radiant temperature, which depends on the kinetic
temperature (T_kin) and the emissivity (ε). Fig. (2.1) is a diagram showing
the factors on which the radiant temperature depends, [34].
Figure (2.1): Diagram of the factors controlling the radiant temperature. The
radiant temperature depends on the kinetic temperature and the emissivity;
the kinetic temperature is governed by the surface heat budget (solar
heating, longwave upwelling and downwelling radiation, anomalous heat
sources, solar elevation, temperature, atmospheric condition, cloud cover,
and topographic altitude) and by the material's thermal properties (thermal
diffusivity, heat capacity, thermal conductivity, specific heat, and thermal
inertia).
The concepts and measurements which underlie the estimates of radiant
temperature from thermal remote sensing data are:
1. Wavelength
The infrared region of the electromagnetic spectrum is typically described
as (0.7 to 1000) μm. Within this infrared region, there are numerous
nomenclatures and little agreement among the various groups on how to define
the sub-boundaries. In terrestrial remote sensing, the wavelength range of
(3 to 35) μm is commonly referred to as thermal infrared. As is the case for
all other remote sensing missions, data collection occurs only in regions
with the least spectral absorption, referred to as atmospheric windows. An
excellent atmospheric window exists across the (8-14) μm wavelengths in the
thermal infrared. Poorer windows are located between (3-5) μm and (17-25) μm.
The interpretation of data in the (3-5) μm range is complicated by
interference with solar reflection in daytime imagery, and the (17-25) μm
range remains largely unexplored. Thus, the region between (8-14) μm has been
of particular interest for thermal remote sensing, [35].
2. Kinetic Temperature
Anything above absolute zero (0 K = −273 °C = −459 °F) emits radiation within
the infrared region; the amount of energy emitted depends on the kinetic
temperature of the surface and its emissivity. Emissivity is the ability of a
real substance to emit radiation compared with a black body. It is a spectral
property that varies with the material's structure and geometric
configuration. Emissivity, denoted by epsilon (ε), is a ratio that ranges
between (0, 1); it is usually between (0.7, 0.95) for the majority of natural
materials. The kinetic temperature of a body or ground is a measure of the
amount of heat energy stored inside it. It is quantified in a variety of
units, including Kelvin (K), degrees Celsius (°C), and degrees
Fahrenheit (°F).
3. Radiant Temperature (T_rad)
The radiant temperature is the actual temperature measured through remote
sensing; it depends on the body's physical or kinetic temperature (T_kin)
and its emissivity (ε). The following equation describes the total radiation
emitted by natural (nonblack body) surfaces:

W = ε σ T_kin^4 ............ (2-1)

where W is the total radiation emitted, σ is the Stefan-Boltzmann constant,
ε is the emissivity, T_kin is the kinetic temperature, and T_rad is the
radiant temperature.

T_rad = ε^(1/4) T_kin .......... (2-2)

From equation (2-2), since natural materials have emissivity ε < 1 (nonblack
bodies), the radiant temperature of natural materials is usually less than
the real surface temperature.
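As an illustrative sketch (not part of the thesis software, which used ENVI and Visual Basic), equations (2-1) and (2-2) can be evaluated directly; the surface values below are hypothetical:

```python
# Illustrative computation of equations (2-1) and (2-2); values are hypothetical.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_radiation(emissivity, t_kin):
    # Eq. (2-1): W = eps * sigma * T_kin^4, total radiation of a graybody
    return emissivity * SIGMA * t_kin ** 4

def radiant_temperature(emissivity, t_kin):
    # Eq. (2-2): T_rad = eps^(1/4) * T_kin
    return emissivity ** 0.25 * t_kin

# A surface at 300 K with emissivity 0.95 appears cooler to the sensor
# than its kinetic temperature, as stated above:
print(round(radiant_temperature(0.95, 300.0), 1))  # 296.2
```

The example reproduces the point made in the text: for any emissivity below one, the radiant temperature falls below the kinetic temperature.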
Figure (2.2): The relationship of black-body radiation intensity with
wavelength; each curve corresponds to a different black-body
temperature, [19].
Thermal radiation is the mechanism by which energy is released in all
directions from a heated surface in the form of electromagnetic radiation and
travels directly to its point of absorption at the speed of light; thermal radiation
does not need an intermediary medium to carry it. Thermal radiation has a
wavelength range ranging from the longest infrared rays to the shortest
ultraviolet rays. Within this range, the strength and distribution of radiant
energy are determined by the temperature of the emitting surface. The amount
of energy emitted at each wavelength defines the black body's emission
spectrum, which is temperature dependent, as shown in Fig. (2.2).
The thermal properties sensed are indicative of only the upper few
centimeters of a material's surface. Since thermal remote sensing measures
emitted radiation, it complements other remote sensing data. It is also
distinctive in identifying surface materials and features, for example
through thermography in agriculture; this includes nursery tracking,
irrigation scheduling, soil salinity detection, disease and pathogen
detection, yield estimation, maturity assessment, and bruise detection.
2.3. The Operational Land Imager
The Operational Land Imager (OLI) advances Landsat sensor technology using an
approach originally demonstrated on NASA's experimental EO-1 satellite.
Instead of the scan-mirror design of earlier Landsat instruments, OLI uses a
push-broom design built around a four-mirror telescope with 12-bit
quantization. Developed by Ball Aerospace, the sensor measures wavelengths
from the visible through the near- and shortwave-infrared, collecting 15 m
panchromatic and 30 m multispectral imagery over a (185 km) swath of the
Earth. Landsat 8's near-polar orbit allows the entire Earth to be imaged
every 16 days. OLI also provides two new spectral bands: one designed
particularly for detecting cirrus clouds and one intended to assist coastal
observations, [36].
OLI Requirements
The OLI specifications called for a sensor that would gather data in nine
spectral bands at a spatial resolution of 30 m (15 m panchromatic) over a
swath around 185 km wide, initially planned for use with a 705 km-altitude
spacecraft. Critical specifications included a five-year design life, band
widths, radiometric precision and stability, processing reference levels, and
data-transfer techniques. The delivery also had to include on-site
performance verification and calibration algorithms, together with process
and reference requirements for performance and radiometric results.
OLI Design
The OLI sensor is a push-broom instrument that utilizes a focal plane with
long arrays of photosensitive detectors. A four-mirror anastigmatic telescope
concentrates incident radiation onto the focal plane while providing a
15-degree field of view that spans a 185 km-wide swath of ground, as shown in
Fig. (2.3). The multispectral digital images are generated by sampling the
across-track detectors periodically as the observatory advances along the
ground track. The detectors are divided into 14 modules that alternate along
the focal plane's centerline. Each spectral band is covered by approximately
7,000 detectors, with the exception of the 15 m panchromatic band, which
needs over 13,000 detectors. Interference filters arranged in a
"butcher-block" pattern over the detector arrays in each module provide the
spectral separation. Silicon PIN (SiPIN) detectors collect the data in the
visible and near-infrared spectral bands, while mercury cadmium telluride
(HgCdTe) detectors collect the data in the shortwave infrared
bands, [37, 38].
Figure (2.3): The OLI Sensor on Board Landsat 8,[39].
2.4. The Normalized Difference Vegetation Index (NDVI)
Using measurements from various parts of the electromagnetic spectrum, a
vegetation index (VI) produces information about vegetation. Most vegetation
indices are based on the contrast between the high near-infrared and low red
reflectance of vegetation, which is governed by chlorophyll absorption. NDVI
has found broad use in monitoring various plant and tree cover types on
regional and continental scales, [40].
The NDVI was used to distinguish vegetation cover from other land cover types
and determine its density. It also allows identifying and visualizing
vegetation areas on the map, as well as detecting abnormal changes in the
growth process. It is therefore commonly used to quantify the growth and
health of vegetation in a given location, [41]. Green leaves absorb incoming
solar radiation in the photosynthetically active spectral region, which
provides the energy needed to power photosynthesis. More specifically, green
leaves absorb incident solar radiation very strongly in the blue and red
spectral regions and less strongly in the green spectral region. In the
near-infrared spectral region, green leaves are highly reflective and absorb
very little.
Thus, green leaves have high visible light absorption together with high near-
infrared reflectance, [42].
NDVI represents the ratio of the difference between the spectral reflectance
at the near-infrared wavelength and the red wavelength to their sum.
Equation (2-3) is the formula used to calculate NDVI:

NDVI = (NIR − R) / (NIR + R) .......... (2-3)

where NIR is the near-infrared reflectance and R is the visible red
reflectance. The NDVI values range between (−1, 1).
Bare soil, cloud, snow, and concrete have NDVI values close to zero,
while water has negative NDVI values and vegetation has positive. Due to the
fact that NDVI is a spectral measure of photosynthesis occurring within a given
spatial region, the value typically increases during the growing season. It then
decreases during the plants' senescent time. Additionally, NDVI will change
year to year as a result of climate changes such as rainfall or temperature during
previous seasons. As a result, researchers must take care to ensure that the
NDVI imagery used matches the timeline of other associated data, [41].
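As an illustration (not part of the thesis processing chain, which used ENVI), equation (2-3) can be applied per pixel to red and near-infrared reflectance arrays; the array values below are hypothetical:

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    # Eq. (2-3): NDVI = (NIR - R) / (NIR + R); eps guards against division by zero
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Hypothetical 2x2 reflectance patches: top row dense vegetation
# (high NIR, low red), bottom row water/dark surface (negative NDVI).
nir_band = np.array([[0.50, 0.45], [0.05, 0.04]])
red_band = np.array([[0.08, 0.10], [0.10, 0.09]])
result = ndvi(nir_band, red_band)
print(np.round(result, 2))
```

The top-row pixels come out strongly positive (vegetation) and the bottom-row pixels negative, matching the sign conventions described above.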
Factors that affect the calculated value of NDVI include the following, [43]:
1. Atmospheric effects: the actual composition of the atmosphere (including
water vapor and aerosols) significantly influences measurements made from
space. The measurements can therefore be misinterpreted if these effects
are not properly taken into account (as is the case when the NDVI is
calculated directly from raw measurements).
2. Clouds: deep (optically dense) clouds are readily visible in satellite
imagery and produce distinctive NDVI values that aid their detection.
However, thin clouds (such as the ubiquitous cirrus) or small clouds with
linear dimensions less than the diameter of the area sampled by the
sensor can significantly contaminate the measurements. Similarly, cloud
shadows in otherwise clear areas can distort NDVI values, resulting in
misinterpretation. These effects are minimized by constructing composite
images from daily or near-daily images. Composite NDVI images have
facilitated the creation of a plethora of new vegetation applications in
which the NDVI, or photosynthetic potential, varies with time.
3. Soil effects: Since soils darken when wet, their reflectance is directly
proportional to their water content. Suppose the spectral response to
moistening is not identical in the two spectral bands. In that case, an
area's NDVI may appear to change due to changes in soil moisture
(precipitation or evaporation), rather than due to changes in vegetation.
4. Anisotropic effects: All surfaces (natural or man-made) reflect light
differently in different directions. This type of anisotropy is typically
spectrally based, even though the overall tendency in these two spectral
bands is identical. As a consequence, the NDVI value may be dependent
on the target's specific anisotropy and the angular geometry of
illumination and observation at the time of the measurements, and thus
on the position of the target of interest within the instrument's swath or
the time of the satellite's passage over the site. This is especially critical
when analyzing Advanced Very High Resolution Radiometer (AVHRR) data, as
the National Oceanic and Atmospheric Administration (NOAA) platforms'
orbits tended to drift over time. At the same time, the use of composite
NDVI images mitigates these issues, resulting in global time-series NDVI
data sets spanning more than 25 years.
2.5. Thermal Properties of Vegetation
Plant- and vegetation-covered areas have different thermal properties than
built-up and hard-surfaced unplanted areas. The primary distinctions are as
follows:
Plants have a lower heat potential and thermal conductivity than hard surfaces
and construction materials. Solar radiation is absorbed mostly by the leaves,
resulting in very little reflected radiation. The soil absorbs rainwater. Later,
water evaporates from the soil and, more specifically, the leaves. Green areas
evaporate at a much faster pace than unplanted, hard-covered areas. Plants
reduce wind intensity and variations near the ground. As a result, the
microclimate inside and near green areas is distinct from that found in
unplanted, developed areas. Temperature, wind velocity and turbulence, air and
radiant temperatures, humidity, and air cleanliness are the primary variations.
Plants' leaves consume most of the solar radiation that strikes them. They
convert a very small portion of the radiant energy into chemical energy by
photosynthesis, thus marginally reducing the rate of urban space heating. The
amount of vegetation present at a location has a direct effect on the local
environment's temperature.
Given that the majority of any vegetation is made up of leaves, vegetation's
effect on the environment's temperature is largely due to leaves. In plants,
leaves are the primary organs responsible for photosynthesis, transpiration, and
food production. Leaves also provide useful knowledge about the
morphological and physiological state of developing plants. Leaf
characteristics such as scale, shape, thickness, venation, the composition of the
surface, water content, photosynthetic, and anatomical properties vary
significantly between plant species. By and large, a dense canopy of vegetation
or leaves results in a cooler climate, particularly in tropical regions. However,
it is very difficult to quantify this temperature drop, for two reasons:
1- the temperature drop caused by vegetation is very slight, perhaps on the
order of 0.1 °C; 2- wind-driven air movement smears out this already very
minimal temperature difference. A temperature difference of up to 0.5 °C can
nevertheless be detected between a location surrounded by dense vegetation
and a nearby location devoid of vegetation.
enable them to regulate the temperature of their surroundings. It is well
established that heat transfer between a plant and its environment occurs
primarily through three processes: (a) conduction and convection as a result of
direct interaction with the air, (b) evaporation of water as latent heat, and (c)
heat loss by radiation, [44]. Thermal conductivity and basic heat capacity
are two critical thermal properties for any material. Two less visible
thermal properties are the material's thermal diffusivity and thermal
effusivity. By definition, thermal diffusivity measures the rate at which
thermal energy spreads through a material. It is expressed in units of m²/s,
via relationship (2-4), [45]:

α = λ / (ρ · c_p) .......... (2-4)

where α is the thermal diffusivity, λ is the thermal conductivity, ρ is the
density, and c_p is the specific heat capacity.
Thermal effusivity, by contrast, largely defines the magnitude of the thermal
wave at the sample surface, [3].
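Relationship (2-4) can be evaluated directly; the following sketch uses approximate, hypothetical material values chosen only to show the comparison (they are not from the thesis):

```python
# Illustrative evaluation of Eq. (2-4); the material values are approximate,
# hypothetical figures used only for comparison.

def thermal_diffusivity(conductivity, density, specific_heat):
    # alpha = lambda / (rho * c_p), in m^2/s
    return conductivity / (density * specific_heat)

alpha_soil = thermal_diffusivity(0.3, 1600.0, 800.0)      # dry soil (approx.)
alpha_concrete = thermal_diffusivity(1.7, 2300.0, 880.0)  # concrete (approx.)
print(alpha_soil < alpha_concrete)  # True: heat spreads faster through concrete
```

This reflects the distinction drawn above between vegetated/soil surfaces and hard construction materials.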
2.6. The Digital Image
Typically, a remotely sensed digital image is composed of picture elements
(pixels) positioned at the intersection of each row i and column j in each
band k of imagery. Each pixel is assigned a number called a Digital Number
(DN) or Brightness Value (BV), which represents the average radiance of a
relatively small area within a scene, as illustrated in Fig. (2.3). A lower
number indicates that the region has a low average radiance, while a higher
number indicates that the area has high radiant properties, [18]. The scale
of this region
influences the way specifics inside the scene are reproduced. The smaller the
pixel size, the more detail is provided in the visual depiction of the scene.
Figure (2.3): Structure of a Digital Image and Multispectral Image,[46].
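As a minimal illustration of this row/column/band structure (not taken from the thesis; the DN values are hypothetical), a digital image can be modeled as a three-dimensional array:

```python
import numpy as np

# A digital image as a (rows, columns, bands) array of digital numbers:
# pixel (i, j) holds one DN (brightness value) per band k.
rows, cols, bands = 4, 5, 3
image = np.zeros((rows, cols, bands), dtype=np.uint8)
image[2, 3, :] = [45, 120, 210]  # hypothetical DNs for the pixel at row 2, column 3

print(image[2, 3, 0])  # DN of that pixel in the first band: 45
```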
2.7. The Digital Change Detection
The identification of digital changes is a critical stage in all remote sensing
processes; these digital changes can be spatial, spectral, or temporal in nature.
The fundamental principle underlying the use of remote sensing data for change
detection is that changes in the land cover must result in changes in radiance
values. Those changes in radiance must be substantial compared to changes in
radiance caused by other factors, [47]. These 'other' variables include variations
in air conditions, variations in the Sun's angle, and variations in soil moisture.
The effect of these variables can be mitigated in part by choosing appropriate
data. For instance, Landsat data collected at the same time of year may help
mitigate problems caused by differences in sun angle and vegetation
phenology. Numerous researchers have attempted to address the change
detection problem using digital satellite data, and numerous methods for
detecting changes in land cover from digital data have been suggested that
may help in updating resource inventories. These methods include land cover
classification comparison, multidate classification, image
differencing/ratioing, vegetation index differencing, principal component
analysis, and change vector analysis.
Approaches to detecting digital changes can be broadly classified as follows:
(1) The data transformation process.
(2) Methods for delineating areas of major change.
There are two primary techniques for detecting changes: a comparative analysis
of classifications made independently for different dates and concurrent
analysis of multi-temporal data. It is worth emphasizing at this point that the
bulk of change detection algorithms require precise spatial registration of the
two images. Aligning the images with one another or with a regular map
projection necessitates the employment of geometric rectification methods,
[48].
Additionally, as we will see later, the majority of approaches involve a
decision about where to establish threshold boundaries to denote areas of
change from areas of no change.
2.7.1. The Digital Change Detection Conditions
1. The satellite images must be of the same spatial resolution, even if they
come from different sensors.
2. The digital values must be of the same quantization level, for example
8-bit.
3. The spatial coordinates of the two or more images on the ground must be
identical; this requires that the coordinates of the upper-left point and
the lower-right point be equal.
4. The atmospheric effects must be eliminated in the two or more images,
which is done using radiometric calibration.
5. The mathematical conditions must be taken into account in the equations
used to compute the numerical change; for example, when subtracting
images, negative values must be treated, and when dividing images,
division by zero must be avoided.
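The conditions above, particularly condition 5, can be sketched as a simple pre-check before differencing two images (an illustrative sketch, not the implementation used in this work):

```python
import numpy as np

def difference_image(img1, img2):
    # Conditions 1 and 2: same grid and same quantization level
    assert img1.shape == img2.shape, "images must share spatial resolution/extent"
    assert img1.dtype == img2.dtype, "images must share quantization level"
    # Condition 5: cast to a signed type before subtracting, so negative
    # differences survive (unsigned 8-bit subtraction would wrap around)
    return img1.astype(np.int16) - img2.astype(np.int16)

a = np.array([[10, 200]], dtype=np.uint8)
b = np.array([[20, 100]], dtype=np.uint8)
print(difference_image(a, b))  # [[-10 100]]
```

Without the signed cast, the 10 − 20 pixel would wrap to 246 in 8-bit arithmetic, which is exactly the "negative values must be treated" problem named in condition 5.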
2.7.2. The Thresholding
If an image contains light objects (change) on a dark background (no change),
a simple thresholding can be used to extract them:

g(x, y) = 1 if I(x, y) > T, and g(x, y) = 0 otherwise .......... (2-4)

where I(x, y) represents the image and T is the threshold value; all pixels
associated with the object (change) are coded 1, while those associated with
the background (no change) are coded 0. If more than one threshold is
desired, the technique of density slicing may be used. This technique groups
numerous objects with varying pixel values into predefined slices. While
grey-level thresholding can be performed interactively using a monitor and an
operator-controlled pointer, the selection of the ideal threshold level
typically needs to be coupled with prior knowledge about the scene or visual
interpretation to be useful, [49]. The threshold values may also be derived
from the histogram of the image.
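A minimal sketch of this thresholding rule (illustrative only; the pixel values and the threshold are hypothetical):

```python
import numpy as np

def threshold(image, t):
    # Pixels brighter than t are coded 1 (change), the rest 0 (no change)
    return (image > t).astype(np.uint8)

diff = np.array([[0.1, 0.8], [0.9, 0.2]])  # hypothetical difference image
mask = threshold(diff, 0.5)
print(mask)  # [[0 1]
             #  [1 0]]
```

Applying several thresholds in sequence, each assigning its own code, would correspond to the density-slicing technique mentioned above.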
2.7.3. Image Band Ratioing
Ratioing is regarded as a reasonably quick method of determining areas of
change. Ratioing is the process of comparing two registered images from
separate dates that contain one or more bands; the data are compared pixel by
pixel, [50]:

R_k(i, j) = BV_k(i, j, t1) / BV_k(i, j, t2) ............ (2-5)

where BV_k(i, j, t1) is the pixel value of band k at row and column (i, j) at
time t1, and BV_k(i, j, t2) is the pixel value of band k at row and column
(i, j) at time t2. If the reflected energy intensity is nearly identical in
each image, then R_k(i, j) = 1, indicating no change.
In areas of change, the ratio can be substantially greater than or less than one,
depending on the extent of the differences between the two dates. The
methodology's critical step is to choose acceptable threshold values in the
distribution's lower and upper tails, representing change pixel values, [51].
Ratioing has been criticized as a method of detecting changes because of the
non-normal distribution on which it is centered, [52]. If non-normal
distributions are used to differentiate change from non-change, the areas
delimited on either side of the mode are different; consequently, the error
rates on either side of the mode are not the same.
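A sketch of band ratioing with tail thresholds (illustrative; the cut-off values 0.75 and 1.33 are hypothetical, and a small epsilon guards the division-by-zero condition of section 2.7.1):

```python
import numpy as np

def band_ratio(bv_t1, bv_t2, eps=1e-10):
    # Eq. (2-5): pixel-by-pixel ratio of two registered dates;
    # eps avoids division by zero (condition 5 of section 2.7.1)
    return bv_t1.astype(np.float64) / (bv_t2.astype(np.float64) + eps)

t1 = np.array([[100, 100], [50, 200]], dtype=np.uint8)
t2 = np.array([[100, 50], [100, 200]], dtype=np.uint8)
ratio = band_ratio(t1, t2)

# Flag change where the ratio falls in the tails of its distribution;
# the cut-offs 0.75 and 1.33 are hypothetical.
changed = (ratio < 0.75) | (ratio > 1.33)
print(changed)
```

Pixels with ratio near 1 (here both unchanged pixels) are treated as no-change, while the 2.0 and 0.5 ratios land in the upper and lower tails respectively.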
2.8. Principal Component Analysis as Change Detector
Remote sensing images can be used to detect changes in the state of the
land's surface. Numerous approaches for change identification have been
proposed and implemented, [53]. Principal component analysis (PCA) is a
critical data transformation technique that is used extensively in remote sensing
applications involving multi-spectral data. The PCA is a useful statistical
technique that has found use in a variety of fields, including face recognition,
image compression, pattern discovery in large-dimensional data, and change
detection. Change detection is a broad term that refers to a technique used in
remote sensing that compares imagery collected over the same region at various
times and highlights features that have changed. The PCA approach employs
Principal Component Analysis to identify areas that have changed, [53].
For best results, images with similar view geometries should be used. Different
view geometries may cause objects such as trees and structures to "lean" in
different directions. Because these issues cannot be resolved by coregistration,
which is performed within the change detection and pan-sharpening tools, they
cause artifacts in the results. For change detection or pan sharpening to be
effective, the images of interest must be closely aligned, [54, 51, 52].
This technique can be evaluated as in the following stages;
1- Merge the two or more spectral images into one image file, provided the
above change detection conditions are met.
2- Apply the PCA kernel to the merged image.
3- The first PC is always the no-change band; the second and/or third PCs
are the change bands. The selection of PC2 or PC3 depends on the number
of bands in the merged image file.
4- The remaining PCs are mostly noise.
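The stages above can be sketched for the two-date, single-band case with a covariance-based PCA (an illustrative sketch only; the tiny synthetic images in the test are assumptions, and real use would stack co-registered satellite bands):

```python
import numpy as np

def pca_change(img1, img2):
    """Stack two co-registered bands as a 2 x N data matrix and apply PCA:
    PC1 carries the common (no-change) information, PC2 the change."""
    data = np.stack([img1.ravel(), img2.ravel()]).astype(float)  # 2 x N
    data -= data.mean(axis=1, keepdims=True)        # center each band
    cov = np.cov(data)                              # 2 x 2 covariance
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]               # sort by descending variance
    pcs = eigvecs[:, order].T @ data                # project onto the PCs
    return pcs[0].reshape(img1.shape), pcs[1].reshape(img1.shape)
```

Changed pixels fall off the no-change axis, so they show up with large magnitudes in the second principal component.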
2.9. The Image Classification
The purpose of image classification procedures is to categorize all pixels in
an image into land cover groups or themes. Image classification is also achieved
using spectral patterns; that is, pixels that share similar spectral reflectance or
emissivity combinations are grouped into groups that are supposed to represent
specific categories of surface characteristics, [57]. In general, there are two
main methods of classification:
2.9.1. Supervised Classification
This method is the most commonly employed classification approach in
remote sensing. The training sample is the most important component of the
supervised Landsat image classification technique: classification accuracy
depends a great deal on the samples selected for training. After applying this
method, the output is a classified image in which each class corresponds to a
named training type. In supervised classification, the user must select the
regions of interest that act as classifiers on the map, and the pixels of the entire
image are then categorized according to these zones for the study area. The
maximum likelihood classification approach, one of the pixel-based supervised
classification techniques, is commonly used in this analysis. In this technique,
training sites are chosen by the user, and the approach is predicated on the
likelihood of a pixel being fitted to a specific category or class. In its basic
form it assumes that the prior probabilities are the same for all groups, [55,56].
This is the classification scheme most often used in remote sensing, as shown
in Fig. (2-4).
Figure (2.4): The Supervised Classification Steps.
The Supervised Classification Steps are, [60];
1. Decide on the categories of land cover into which the image will be divided.
These are the information classes, which can include but are not limited to
water, urban areas, soil, and vegetation.
2. Select prototype or representative pixels for each required class. These
pixels are chosen to represent a region of interest (ROI) and serve as
training data. The training groups for each class can be determined via
aerial photos, site visits, maps, and even photo analysis of color
composite output generated from the image data. Frequently, the training
pixels for a given class are bounded by a boundary; this region is referred
to as an area of interest (ROI).
3. Utilize the training data to estimate the parameters of the classifier algorithm
to be used; these parameters will be either the properties of a probability
model or equations that describe partitions in the multi-spectral space. The
set of parameters for a specific class is often referred to as the signature of
that class.
4. Using the trained classifier, label or classify each pixel of the image as a
suitable ground cover type (information class). Typically, the entire image
section of interest is classified in this way. While the training in step 2
requires the user to manually specify up to roughly 1% of the pixels of an
image, the machine determines the remainder using classification.
5. Create summaries, as tables or thematic (class) maps, that present the
results of the classification.
6. Assess the accuracy of the final product using a labeled testing data set. In
practice, it may be necessary to refine the training process based on the
results obtained at this step in order to enhance classification accuracy.
2.9.1.1. Maximum Likelihood Classification
Maximum likelihood classification is the most frequently used supervised
classification technique for image data collected via remote sensing. It is a
statistically sound method that can be derived directly from Bayes'
classification. The image's spectral classes, [61], can be represented as
ω_i, i = 1, …, M, where M is the total number of classes. In order to describe
the class to which a pixel vector x belongs, the conditional probabilities

p(ω_i | x), i = 1, …, M

are used. The measurement vector x is the column containing the pixel's
brightness values. It portrays the pixel as a point in multi-spectral space with
axes described by the image's brightness values, as shown in Figure (2-4) for
the simple two-dimensional example. The probability p(ω_i | x) gives the
probability that the right class for a pixel at position x is ω_i, [61].
Classification is carried out by

x ∈ ω_i if p(ω_i | x) > p(ω_j | x) for all j ≠ i ………. (2-6)

i.e., if p(ω_i | x) is the largest, the pixel at x belongs to the class ω_i. This
decision rule is a particular instance of a more general rule under which the
decision could be weighted by the various degrees of cost associated with
different incorrect classifications; it underlies the Maximum Likelihood
Decision Rule. Despite its simplicity, the value of p(ω_i | x) in equation (2-6)
is unknown, [61]. Assume, however, that appropriate training data are
available for each type of land cover. These can be used to estimate a
probability distribution for each cover type that characterizes the chance of
finding a pixel from class ω_i at a given location x. The shape of this
distribution will be specified later on; for the time being, it will be kept
general and denoted by the symbol p(x | ω_i). There are as many p(x | ω_i)
as there are ground cover classes. In other words, for a pixel located at
position x in multispectral space, a set of probabilities can be computed that
represents the relative likelihood that the pixel belongs to each available
class. Bayes' theorem relates the desired p(ω_i | x) in Eq. (2-6) and the
available p(x | ω_i) estimated from training data:

p(ω_i | x) = p(x | ω_i) p(ω_i) / p(x) ………. (2-7)

where p(ω_i) denotes the probability that class ω_i occurs in the image. If
15% of the pixels in an image belong to class ω_i, for example, then
p(ω_i) = 0.15; p(x) in equation (2-7) represents the probability of finding a
pixel from any class at the position x. It is worth noting in passing that

p(x) = Σ_i p(x | ω_i) p(ω_i) ………. (2-8)

although p(x) in and of itself is not needed in the following. The p(ω_i)
values are referred to as a priori or prior probabilities because they represent
the probabilities with which a pixel's class membership could be guessed prior
to classification. By contrast, p(ω_i | x) represents the posterior probabilities.
Equation (2-7) demonstrates that the classification rule of equation (2-6) can
be restated as:

x ∈ ω_i if p(x | ω_i) p(ω_i) > p(x | ω_j) p(ω_j) for all j ≠ i …… (2-9)

where p(x) has been omitted as a common factor. The rule of equation (2-9)
is more usable than that of equation (2-6), since the p(x | ω_i) are known
from training data, and it is conceivable that the p(ω_i) are also known or can
be estimated from the analyst's knowledge of the image, [59,60]. Mathematical
convenience results if, in equation (2-9), the definition

g_i(x) = ln{p(x | ω_i) p(ω_i)} ………. (2-10)
       = ln p(x | ω_i) + ln p(ω_i) ………. (2-11)

is used, where ln denotes the natural logarithm. Rewriting (2-9) gives

x ∈ ω_i if g_i(x) > g_j(x) for all j ≠ i ………. (2-12)

This is the decision rule used in maximum likelihood classification; the
g_i(x) are referred to as discriminant functions. As described in Fig. (2-5),
two-dimensional multi-spectral space with the different spectral classes is
constructed in this manner.
Figure (2.5): Two-Dimensional Multispectral Space with Gaussian
Probability Distributions Representing the Spectral Groups, [64].
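The decision rule of equations (2-10) to (2-12), with Gaussian class models, can be sketched as follows (a minimal illustration; the synthetic two-band training samples, class means, and equal priors are assumptions, not data from this study):

```python
import numpy as np

def train_gaussian(classes_training):
    """Estimate the mean vector and covariance matrix of each class
    from its training pixels (each entry: an N x bands array)."""
    params = []
    for samples in classes_training:
        params.append((samples.mean(axis=0), np.cov(samples.T)))
    return params

def ml_classify(x, params, priors):
    """Maximum likelihood rule (eq. 2-12): pick the class whose discriminant
    g_i(x) = ln p(x|w_i) + ln p(w_i) is largest."""
    scores = []
    for (mean, cov), prior in zip(params, priors):
        diff = x - mean
        # Log of the multivariate normal density, up to a class-independent constant.
        g = (-0.5 * np.log(np.linalg.det(cov))
             - 0.5 * diff @ np.linalg.inv(cov) @ diff
             + np.log(prior))
        scores.append(g)
    return int(np.argmax(scores))

# Assumed example: two well-separated classes in two-band space.
rng = np.random.default_rng(0)
c0 = rng.normal([10.0, 10.0], 1.0, size=(50, 2))
c1 = rng.normal([30.0, 30.0], 1.0, size=(50, 2))
params = train_gaussian([c0, c1])
```

The class-independent constant of the log density is dropped, exactly as p(x) is dropped between equations (2-7) and (2-9).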
2.9.1.2. Minimum Distance Classification
Maximum likelihood classification is efficient when the mean vector m_i
and the covariance matrix of each spectral class can be estimated fairly
accurately. This is contingent upon a sufficient number of training pixels
being available for each of those classes. When this is not the case, inaccurate
estimates of the covariance elements result, leading to incorrect classification.
When the number of training samples per class is limited, it can be more
effective to use a classifier that ignores the covariance information and relies
instead on the mean positions of the spectral classes, which can be estimated
more accurately than the covariances, [65]. Such a strategy is known as the
minimum distance classifier or, more precisely, the minimum distance to class
means classifier. This classifier uses the training data to determine class
means; classification is then performed by assigning a pixel to the class with
the closest mean. The minimum distance algorithm is also attractive because
it is a faster technique than maximum likelihood classification. It is, however,
less versatile than the latter due to the absence of covariance data. Maximum
likelihood classification models each class using a multivariate normal model
that is capable of accounting for the spread of the data in particular spectral
directions. Since the minimum distance technique does not use covariance
data, its class models are symmetric in the spectral domain; as a result,
elongated classes would be poorly modeled, and several spectral classes might
be required for this algorithm where only one would suffice for maximum
likelihood classification. The discriminant function for the minimum distance
classifier is derived as follows, [61, 62]. Assume m_i, i = 1, …, M represent
the class means determined from the training data, and x the position of the
pixel to be classified. The squared distance of the unknown pixel from the
class mean m_i, specified in vector form, is

d(x, m_i)² = (x − m_i)^T (x − m_i) ………. (2-13)
           = (x − m_i) · (x − m_i) ………. (2-14)

By expanding,

d(x, m_i)² = x · x − 2 m_i · x + m_i · m_i ………. (2-15)

The classification process is based on the following criterion:

x ∈ ω_i if d(x, m_i)² < d(x, m_j)² for all j ≠ i

Note that x · x is common to all d(x, m_i)² and thus can be removed.
Moreover, rather than classifying according to the smallest of the remaining
expressions, the signs can be reversed and classification performed based on

x ∈ ω_i if g_i(x) > g_j(x) for all j ≠ i ……… (2-16)

where g_i(x) = 2 m_i · x − m_i · m_i ………. (2-17)

Equation (2-17) specifies the discriminant function for the minimum distance
classifier. Unlike those of the maximum likelihood method, the decision
surfaces that distinguish the spectral class regions in multispectral space are
linear. Maximum likelihood classification's higher-order decision surfaces
make it more effective at dividing multi-spectral space than the linear surfaces
used in the minimum distance method, [57,61], as shown in Fig. (2-6).
Figure (2.6): Illustrates the Supervised Parallelepiped Classification Process,
Performed on Multispectral Bands (i.e., red band 3 versus near-infrared band
4), [67].
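The discriminant of equation (2-17) can be sketched in a few lines (a minimal illustration; the class means and sample pixels are assumed values):

```python
import numpy as np

def min_distance_classify(pixels, class_means):
    """Minimum distance to class means, via the discriminant of eq. (2-17):
    g_i(x) = 2 m_i . x - m_i . m_i   (the common x . x term is dropped)."""
    means = np.asarray(class_means, dtype=float)   # M x bands
    X = np.asarray(pixels, dtype=float)            # N x bands
    g = 2.0 * X @ means.T - np.sum(means * means, axis=1)  # N x M discriminants
    return np.argmax(g, axis=1)                    # class with the largest g_i
```

Maximizing g_i is equivalent to minimizing the squared distance of equation (2-15), but avoids recomputing x · x for every class.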
2.9.2. Unsupervised Classification
This technique analyses a large number of unknown pixels and divides
them into several groups based on natural groupings in the image values.
Analyst-specified training data are not needed for unsupervised classification.
The fundamental principle is that measurements from a given cover type
should lie close together in measurement space, while data from different
classes should be comparatively well separated. Unsupervised classifications
yield spectral classes based on the natural grouping of the image values; the
identity of a spectral class is unknown initially, and it must be compared with
reference data (including higher-resolution imagery, maps, or on-site visits) to
determine the identity and information value of the spectral classes, [63,64].
Figure (2.7): The Unsupervised Classification Technique.
2.9.2.1. Isodata Classification Method
The means of the classes (i.e., those that are uniformly distributed in the
data space) are first determined in this classification system. The remaining
pixels are then clustered iteratively using minimum distance techniques. Each
iteration updates the means and reclassifies pixels concerning the updated
means. Class splitting, merger, and deletion are performed iteratively based on
the input threshold parameters (i.e., minimum and maximum number of
proposed classes, number of iterations, maximum standard deviation from the
means, and maximum error distance). All pixels are then grouped into the
closest class, unless a standard deviation or distance threshold is defined, in
which case some pixels might be unclassified if they do not meet the specified
criterion. This procedure is repeated until the number of pixels in each class
changes by less than the chosen pixel-change threshold or until the maximum
number of iterations is reached, [70].
2.9.2.2. K-Means Classification Method
Similar to the isodata process, the K-Means unsupervised classification
calculates initial class means (i.e., uniformly distributed throughout the data
space) and then clusters the pixels using a minimum distance technique
iteratively into the nearest class. Each iteration updates the class means and
reclassifies pixels in accordance with the updated means. Unless a standard
deviation or distance threshold is defined, all pixels are categorized into the
closest class. At this point, some pixels may be unclassified if they do not meet
the specified criterion. This procedure is repeated until the number of pixels in
each class changes by less than the chosen pixel-change threshold or until the
maximum number of iterations is reached, [71].
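The K-Means loop described above can be sketched as follows (a minimal NumPy sketch; for brevity the initial means are drawn at random from the data rather than spaced uniformly through the data space, and the optional standard deviation/distance thresholds are omitted):

```python
import numpy as np

def kmeans_classify(pixels, k, n_iter=20, seed=0):
    """K-means unsupervised classification: assign each pixel to the nearest
    class mean (minimum distance), update the means, and repeat."""
    X = np.asarray(pixels, dtype=float)            # N x bands
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), size=k, replace=False)]   # initial class means
    for _ in range(n_iter):
        # Distance of every pixel to every class mean (N x k).
        d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = d.argmin(axis=1)                  # minimum distance assignment
        for j in range(k):                         # update each non-empty class mean
            if np.any(labels == j):
                means[j] = X[labels == j].mean(axis=0)
    return labels, means
```

A production run would also track how many pixels change class per iteration and stop once that count falls below the pixel-change threshold.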
2.10. Statistical Digital Image
The study of statistics is the gathering, arrangement, examination, and
interpretation of data. It covers all aspects of this, including data collection
preparation in terms of survey and experiment design. Mean, mode, median,
variance, standard deviations, covariance, skewness, and kurtosis are numerous
statistical measures. Many of these measures are applied in a variety of
fields of science and social study, including biostatistics, computational
biology, computational sociology, network biology, social science, sociology,
and social research. The primary goal is to demonstrate the fundamental
application of these steps in various fields of digital image processing, such as
image enhancement, reconstruction, denoising, and edge detection, and to
facilitate the selection of statistical parameters for a particular image processing
technique, [72].
2.11. Proposed Statistical Model
While research on a few of these measures has already been conducted at
a high level, we have proposed a simple statistical model in Fig.(2-8) for image
processing to optimize its functionality. The following measures comprise the
proposed statistical model,[73]:
1. Analysis of the input image, Ii (x, y): In this stage, the input image is
statistically analyzed using various metrics such as mean, mode, median,
variance, standard deviation, covariance, skewness, and kurtosis.
2. Statistical parameter selection: The statistical parameter is selected based on
the performance-optimized image specifications.
3. Image Filtering: The image is filtered using the statistical parameter chosen
in the previous step. Depending on the specifications, we can filter the image
using anything from a simple filter to a multi-parameter complex filter. The
following parts will explain how to conduct statistical analysis on a picture
using a variety of statistical measures.
Figure (2.8): A Simple Statistical Model for Image Processing to Optimize its
Functionality.
Chapter Three
Results and Discussion
3.1. Diagram of the Work Procedure
Figure (3.1): Diagram of the Work Procedure.
3.2. The Available Data
The satellite images of Baghdad city were downloaded from the USGS
(United States Geological Survey) site. Fig. (3.2, A and B) represents a full
Landsat Series scene for the periods 2000, 2010, 2015, and 2017. Also, table
(3.1) presents the dates and some information on the captured images of
Baghdad city.
Figure (3.2, A): A full scene of Landsat Series
Figure (3.2, B): The Composite Bands of study area.
Table (3.1): some information of captured images of study area.
Source of Image                | Date            | Composite Bands | Resolution (m)
Landsat 5 Thematic Mapper (TM) | Dec. 29th, 2000 | B5, B4, B3      | 30
Landsat 5 Thematic Mapper (TM) | Dec. 9th, 2010  | B5, B4, B3      | 30
Landsat 8 OLI (Operational Land Imager) & TIRS (Thermal Infrared Sensor) | Dec. 7th, 2015  | B7, B5, B3 | 30
Landsat 8 OLI (Operational Land Imager) & TIRS (Thermal Infrared Sensor) | Dec. 12th, 2017 | B7, B5, B3 | 30
3.3. The Satellite Image Subset
All images were resized to a particular region of interest to reduce the data
size and to ease result extraction and display. The subset process was built on
the geo-coordinate method that includes: (the upper left point Lat. 33° 12′
1.79″, Long. 44° 30′ 46.05″ to the lower right point Lat. 32° 55′ 51.61″, Long.
44° 50′ 5.45″) in the geographic coordinate system (unit: degree, minute, and
second). These coordinates are equivalent to (the upper left point 454575 E,
3673635 N to the lower right point 484545 E, 3643665 N) in the UTM system
(unit: meter). The resultant images were of 1000 × 1000 pixels. This image
subset is essential for all work steps, as the images will then coincide and be of
the same size. Fig. (3.3, A to D) shows the subset image set. The subset was
done for all bands.
Figure (3.3, A): Composite of b3, b4 and b5 Image, 2000.
UTM Projection
Geographic Projection
Figure (3.3, B): Composite of b3, b4 and b5 Image, 2010.
UTM Projection
Geographic Projection
Figure (3.3, C): Composite of b3, b5 and b7 Image, 2015.
UTM Projection
Geographic Projection
Figure (3.3, D): Composite of b3, b5 and b7 Image, 2017.
UTM Projection
Geographic Projection
3.4. Normalized Difference Vegetation Index (NDVI)
The Normalized Difference Vegetation Index (NDVI) helps distinguish
vegetation cover from other land cover types and determines its density. It also
allows identifying and visualizing vegetation areas on the map as well as
detecting abnormal changes in the growth process. Equation (3.1) is the
formula used to calculate NDVI:

NDVI = (NIR − Red) / (NIR + Red) ……….. (3.1)

Where;
NIR is the digital number in the Near-Infrared band, and
Red is the digital number in the Red band.
Table (3.2): Information of Landsat bands.
Source          | Band    | Wavelength (µm)
Landsat 5 (TM)  | 3 (Red) | 0.631 - 0.692
Landsat 5 (TM)  | 4 (NIR) | 0.772 - 0.898
Landsat 8 (OLI) | 4 (Red) | 0.636 - 0.673
Landsat 8 (OLI) | 5 (NIR) | 0.851 - 0.879
From equation (3.1), it is clear that the NDVI values range between (−1 to 1).
Dense vegetation gives a more positive NDVI value, while a surface without
vegetation has an NDVI value close to zero or negative. In general, the low
NDVI values refer to soil and other ground categories. The results appear in
grayscale: white indicates the presence of dense vegetation cover, and as the
tone darkens, the vegetation thins, until black indicates there is no vegetation
in the area, as shown in Fig. (3.4, A to D).
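The NDVI computation of equation (3.1) can be sketched as follows (a minimal illustration; the sample band values are assumptions, and pixels whose denominator is zero are set to 0, a choice added here for robustness):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); values range from -1 to 1.
    Dense vegetation gives high positive values, bare surfaces near zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    # Divide only where the denominator is non-zero, avoiding 0/0.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out
```

For Landsat 5 TM the inputs would be bands 4 (NIR) and 3 (Red); for Landsat 8 OLI, bands 5 and 4, per table (3.2).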
Figure (3.4, A): The NDVI Image, 2000.
UTM Projection
Geographic Projection
Figure (3.4, B): The NDVI Image, 2010.
UTM Projection
Geographic Projection
Figure (3.4, C):The NDVI Image, 2015.
UTM Projection
Geographic Projection
Figure (3.4, D):The NDVI Image, 2017.
UTM Projection
Geographic Projection
3.5. Classification of The NDVI Images
In this work, the classification of the NDVI images of the study area was
applied for all the above periods. Using the supervised classification (minimum
distance classifier) shows the amount of vegetation in the study area. With
supervised classification, the regions of interest must be determined by the user
according to the image intensities. The classification process yields unclassified
regions and the regions around vegetation, as shown in Fig. (3.5, A to D). From
the statistical image calculation, the vegetation areas were calculated for each
period and are shown in table (3.3). For all the Landsat imagery used in this
research, the ground spatial resolution of each image is 30 m; therefore, each
pixel in any image represents 900 square meters on the ground.
Table (3.3): The Vegetation Area Calculation for the Classified NDVI Images.

Year | Vegetation Area (km²)
2000 | 43.3125
2010 | 37.4675
2015 | 9.1881
2017 | 22.77495
From table (3.3), the vegetation area appears to decrease from 2000 to
2015; this is attributable to drought and low precipitation as well as the dust
storms. After 2015, the effects of the above factors decreased, and therefore
the vegetation growth increased.
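The conversion from classified pixel counts to area can be checked in a few lines, given the 900 m² per pixel figure stated above (the pixel count shown is an assumed example that reproduces the 2000 value in table (3.3)):

```python
# Each Landsat pixel covers 30 m x 30 m = 900 square meters on the ground.
PIXEL_AREA_M2 = 30 * 30

def vegetation_area_km2(pixel_count):
    """Convert a count of vegetation-classified pixels to an area in km^2."""
    return pixel_count * PIXEL_AREA_M2 / 1_000_000.0

# e.g., 48125 vegetation pixels correspond to 43.3125 km^2.
area_2000 = vegetation_area_km2(48125)
```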
Figure (3.5, A): The Classification NDVI Image by Supervised Classification
(minimum distance classifier), 2000 .
Vegetation
Nothing
Around Vegetation
UTM Projection
Geographic Projection
Figure (3.5, B): The Classification NDVI Image by Supervised Classification
(minimum distance classifier), 2010.
Vegetation
Nothing
Around
Vegetation
Water
UTM Projection
Geographic Projection
Figure (3.5, C): The Classification NDVI Image by Supervised Classification
(minimum distance classifier), 2015.
Vegetation
Nothing
Around Vegetation
UTM Projection
Geographic Projection
Figure (3.5, D):The Classification NDVI Image by Supervised Classification
(minimum distance classifier), 2017.
3.6. The Climatic and Weathering Factors
As mentioned previously, climatic factors such as temperature,
precipitation, and evaporation influence the growth of plants. In this
research, two climatic factors were considered in the results interpretation:
temperature and precipitation for the years from 2000 to 2017. The climatic
factor data were obtained from the USGS website. For the temperature
distribution, the mean temperature of the hot months, i.e. from May to
October, was considered. The amount of precipitation was taken for the
months from November to April, as shown in Table (3.4).
Table (3.4): Average Temperature and Average Rainfall of the study area.

Year | Average temperature (˚C) from May to October | Average Rainfall (mm) from November to April
2000 | 34.24 | 30
2001 | 33.44 | 26
2002 | 33.44 | 34
2003 | 33.64 | 22
2004 | 32.55 | 26
2005 | 33.09 | 16
2006 | 33.97 | 29
2007 | 33.44 | 20
2008 | 32.8  | 16
2009 | 31.81 | 13
2010 | 33.75 | 15
2011 | 32.84 | 16
2012 | 33.5  | 25
2013 | 31.6  | 33
2014 | 32.58 | 22
2015 | 33.06 | 11
2016 | 32.66 | 18
2017 | 33.11 | 31
From tables (3.3) and (3.4), for the years selected in the research, the
temperature fluctuates from 34.24 to 31.6 ˚C, and the average of the max.
and min. temperatures is 32.92 ˚C. The temperature is not the main factor
that influences the vegetation growth, so it is clear from the data that the
precipitation values are the main factor that affects the growth. The values
of rainfall decreased from 34 mm to 11 mm from 2002 to 2015 and then rose
to 31 mm in 2017, so the vegetation growth increased. Figures 3.6 and 3.7
represent the average monthly temperature (˚C) and rainfall (mm) of the
study area during the years (1998 - 2018), respectively.
Figure (3.6): The average monthly temperature(˚C)from May to October of
the study area during (1998 - 2018) years.
Figure (3.7): The average monthly rainfall (mm) from November to April of
the study area during (1998 - 2018) years.
3.7. The Digital Change Detection for Thermal Bands
At this stage, the thermal bands of the available data were investigated
to extract the digital change detection in vegetation areas. Band 6 represents
the thermal band in Landsat 5; its wavelength is (10.4 to 12.5 µm), with a
resampled spatial resolution of 30 m. The thermal bands in Landsat 8 are b10
& b11, with a thermal-band wavelength of (10.6 to 12.5 µm) and a resampled
spatial resolution of 30 m, as shown in Fig. (3.8, A to D); please see the
description of each image. The size of the thermal bands is similar to the
visible and NIR bands, and the locations and geomatic data are the same. In
order to cover the spectrum of the thermal bands of TM5 (2000 and 2010), the
mean of b10 and b11 in TIRS (2015 and 2017) was computed using a simple
program in Visual Basic 6.0.
Figure (3.8, A):The Thermal band of TM5, 2000.
UTM Projection
Geographic Projection
Figure (3.8, B): The Thermal band of TM5, 2010.
UTM Projection
Geographic Projection
Figure (3.8, C): The Thermal band of TIR, 2015.
UTM Projection
Geographic Projection
Figure (3.8, D): The Thermal band of TIR, 2017.
3.7.1. Image Differencing
The first method of digital change detection for the vegetation growth is
the subtraction of thermal band images: 2015 from 2017, 2010 from 2017,
2000 from 2017, 2000 from 2010, and 2000 from 2015 for Baghdad city, as
shown in Fig. (3.9, A to E). This method was achieved by the equation;
change(subtraction) = b1 − b2 + c ………. (31)
where;
b1, b2: represent the thermal band images of the various times.
c: represents a threshold value selected according to the resulting change
contrast and quality.
The subtraction results in positive and negative values in areas of
radiance changes and zero values in areas where no changes occurred. The
more obvious change will be presented as to be brighter or darker features,
while the mid-gray pixels indicate relatively time invariant features. According
to equation (31), the bright features may present the more frequent changes,
while the darker area presents those invariant features. However, a suitable
threshold value may be suggested to differentiate between change occurrence
and non-occurrence, [63]. This threshold may be varied to improve the change-
predictability of the analysis. In this research, equation (31) has been utilized
to predict the differences between two co-registered sets of images. In fact, the
threshold value has been decided according to the resulting boundaries in the
differencing image (i.e., all difference values less than the threshold are
assigned zero values, while those above the threshold are assigned the value
255). Iteratively, a suitable threshold has been found to range between 70-85.
To reduce the undesirable boundaries, isolated pixels have been discarded,
using a 3×3 mean filter before the differencing operation.
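The thresholded differencing of equation (31), preceded by the 3×3 mean filter, can be sketched as follows (a minimal illustration; the synthetic bands and the threshold value are assumptions, not the 70-85 thresholds found in this study):

```python
import numpy as np

def mean_filter_3x3(img):
    """3x3 mean filter to suppress isolated pixels before differencing."""
    img = img.astype(float)
    padded = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img)
    for dy in range(3):            # sum the 9 shifted copies of the image
        for dx in range(3):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

def difference_change(b1, b2, threshold):
    """Thresholded image differencing: absolute differences above the
    threshold are mapped to 255 (change), the rest to 0 (no change)."""
    diff = np.abs(mean_filter_3x3(b1) - mean_filter_3x3(b2))
    return np.where(diff > threshold, 255, 0).astype(np.uint8)
```

Here the absolute difference is used so that both positive and negative radiance changes are flagged, matching the bright/dark change description above.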
Figure (3.9, A): The 2017 Thermal Band minus 2015 Thermal Band, with
threshold 70.
UTM Projection
Geographic Projection
Figure (3.9, B): The 2017 Thermal Band minus 2010 Thermal Band, with
threshold 85.
UTM Projection
Geographic Projection
Figure (3.9, C): The 2017 Thermal Band minus 2000 Thermal Band, with
threshold 77.
UTM Projection
Geographic Projection
Figure (3.9, D): The 2010 Thermal Band minus 2000 Thermal Band, with
threshold 77.
UTM Projection
Geographic Projection
Figure (3.9, E): The 2015 Thermal Band minus 2000 Thermal Band, with
threshold 77.
UTM Projection
Geographic Projection
3.7.2. Image Rationing
The second method of digital change detection of the vegetation growth
is image rationing: image 2017 is divided over image 2010, 2010 over 2000,
2017 over 2000, and 2015 over 2000; Fig. (3.10, A to D) represents the results.
This method was achieved by equation (32) as follows;
change(rationing) = b1 / (b2 + 1) ………. (32)
where;
b1, b2: represent the thermal band images of the various times.
1: a value used to avoid division by zero.
Sometimes the differences in brightness values from similar surface
materials may be caused by topographic conditions, shadows, or seasonal
changes in sunlight illumination angle and intensity. These conditions may
hamper the ability of an interpreter or classification algorithm to correctly
identify surface materials or land-use in a remotely sensed image. Fortunately,
ratio transformations of the remotely sensed data can, in certain instances, be
applied to reduce the effects of such environmental factors. This algorithm
consists of computing the ratio of the values of corresponding bands. If no
changes occurred, the result is unity, while if changes existed in a particular
pixel, the ratio will be either considerably more or less than unity (depending
on the “direction” of the change).
A normalization operation may be applied to represent the range of the
function in a standard 8-bit format. Using this normalization function, the ratio
of 1.0 is assigned the brightness value 128, while those in the range between
1/255 to < 1.0 are reassigned values between 1 to < 128, using:

normalized(change) = int(rationing × 127) + 1 ……… (33)

Accordingly, the ratios from 1 to 255 are assigned values within the range
128 to 255, given by

normalized(change) = int(128 + (rationing / 2)) ………. (34)
It should be noted that all pixels having the value 128 represent the no-
change areas, while the others indicate changes. In fact, this result cannot be
adopted blindly, because the ratio 0/0 goes to 0/1 and thus also represents a
no-change area. For this reason, the analyst must differentiate between change
and no-change areas by utilizing threshold values.
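Equations (32) to (34) can be sketched together as follows (a minimal illustration; the sample band values are assumptions):

```python
import numpy as np

def normalized_ratio(b1, b2):
    """Ratio change detection (eq. 32) normalized to 8 bits (eqs. 33, 34):
    a ratio of 1.0 maps to 128 (no change); ratios below 1 map to 1..127,
    ratios above 1 map to 128..255."""
    ratio = b1.astype(float) / (b2.astype(float) + 1.0)  # eq. (32): +1 avoids /0
    out = np.empty(ratio.shape, dtype=np.uint8)
    low = ratio < 1.0
    out[low] = (ratio[low] * 127).astype(np.uint8) + 1                      # eq. (33)
    out[~low] = np.clip(128 + ratio[~low] / 2, 128, 255).astype(np.uint8)   # eq. (34)
    return out
```

The clip keeps ratios above 255 from wrapping around in the 8-bit output; the remaining ambiguity around the value 128 is exactly the 0/0 case the text warns about, which a threshold must resolve.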
Figure (3.10, A): The 2017 Thermal Band over the 2010 Thermal Band (UTM and geographic projections).
Figure (3.10, B): The 2010 Thermal Band over the 2000 Thermal Band (UTM and geographic projections).
Figure (3.10, C): The 2017 Thermal Band over the 2000 Thermal Band (UTM and geographic projections).
Figure (3.10, D): The 2015 Thermal Band over the 2000 Thermal Band (UTM and geographic projections).
3.7.3. Principal Components Analysis for Change Detection
In this stage of the research, Principal Component Analysis (PCA) is used as a change detector. Although each band records measurements of a single feature, adjacent bands are often highly correlated, which means they are not statistically independent. Principal components analysis (also referred to as Karhunen-Loeve analysis) is a technique that produces images whose mutual correlation is zero. For an n-dimensional dataset, n principal components can be produced. An important advantage of PCA is that most of the information in all bands can be compressed into a much smaller number of components with little loss of information. Consequently, this procedure can greatly reduce the computer processing time of programs implemented for image classification and/or image change detection. Moreover, the linear PCA transformation can be used to translate and rotate the data into a new coordinate system that maximizes the variance of the data, and it can also be applied to enhance the information content. Increasing use of the PCA technique is therefore being made in the remote sensing sciences, especially for reducing dataset dimensionality [64]. A principal components transform is most easily visualized in two dimensions: the goal of the PCA is to translate and/or rotate the original axes so that the original brightness values on axes X1, X2 are redistributed onto a new set of axes or dimensions, X1´, X2´.
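The translation and rotation described above can be illustrated with a small NumPy sketch on two synthetic, highly correlated "bands" (the data and variable names are illustrative, not the study's imagery): the data are mean-corrected, the covariance matrix is formed, and its eigenvectors define the rotated axes along which the first component carries the maximum variance.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(100.0, 10.0, 1000)           # band 1 brightness values
x2 = 0.9 * x1 + rng.normal(0.0, 2.0, 1000)   # band 2, highly correlated

D = np.column_stack([x1, x2])     # n-by-r data matrix (r = 2 bands)
P = D - D.mean(axis=0)            # mean-corrected data matrix
C = (P.T @ P) / (len(P) - 1)      # r-by-r covariance matrix

vals, vecs = np.linalg.eigh(C)    # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]    # re-order so lambda1 >= lambda2
vals, vecs = vals[order], vecs[:, order]

pcs = P @ vecs                    # brightness values on the rotated axes
# The components are uncorrelated, and the first one holds
# almost all of the variance of the two correlated inputs.
```

Because the two inputs are nearly redundant, the first component captures well over 90% of the total variance here, which is the dimensionality-reduction property exploited for classification and change detection.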
3.7.3.1 Mathematical Formulation
The following summarizes the mathematical procedure involved in the PCA transformation [64]. Consider a set of multi-band images f(x, y, r), each of size M×N, with r bands.
1. Each image band is expressed in the form of an n-dimensional vector D_{n,1}, where n = M×N, given by:
D_{n,1} = [ f(1,1,1), f(1,2,1), ..., f(i,j,1), ..., f(N,M,1) ]^T ………. (3-5)
2. All image bands are then arranged into a matrix D_{n,r} of n rows and r columns, given by:
          | f(1,1,1)   f(1,1,2)   ...   f(1,1,r) |
          | f(2,1,1)   f(2,1,2)   ...   f(2,1,r) |
          |    .          .       ...      .     |
D_{n,r} = | f(i,j,1)   f(i,j,2)   ...   f(i,j,r) |
          |    .          .       ...      .     |
          | f(N,M,1)   f(N,M,2)   ...   f(N,M,r) | ………. (3-6)
3. The mean of each column in Eq. (3-6) is computed by taking the arithmetic average of that column, given by:

D̄_b = (1/n) Σ_{i=1..N} Σ_{j=1..M} f(i, j, b),  where b = 1, 2, ..., r ………. (3-7)
4. An (n-row, r-column) matrix P_{n,r} is computed by subtracting the mean of each column, D̄_b, from the values of that column. The result is called the "mean-corrected data matrix", whose values are given by:

P_{j,b} = D_{j,b} - D̄_b,  j = 1, 2, ..., n and b = 1, 2, ..., r ………. (3-8)
5. The covariance matrix of the P_{j,b} matrix can then be computed by:

C_P = E{ (D - D̄)(D - D̄)^T } ………. (3-9)

where E{·} is the expectation operator, T indicates transposition, and C_P is an r×r matrix.
6. Now, let e_i and λ_i, i = 1, 2, ..., r, be the eigenvectors and corresponding eigenvalues of the matrix C_P. For convenience, the eigenvalues are assumed to be arranged in decreasing order, i.e. λ_1 ≥ λ_2 ≥ λ_3 ≥ ... ≥ λ_r.
7. A transform matrix A, whose rows are the eigenvectors of C_P, can then be computed, given by:
11