Conference Paper

A study on optical and SAR data fusion for extracting flooded area

Capital Normal Univ.
DOI: 10.1109/IGARSS.2007.4423497 · Conference: IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2007)
Source: IEEE Xplore

ABSTRACT This article proposes a method of fusing optical data acquired before a flood with SAR (synthetic aperture radar) data acquired during the flood. The main goal of this study is to evaluate the capability of data fusion to combine information from a Landsat ETM image with a Radarsat SAR image for water body and flooded area extraction. The Landsat ETM data provides landform and background information, including the normal water extent. The Radarsat SAR data, acquired during the flood, provides information mainly on the water body extent and the flooded area. In the resulting image, the flooded area is significantly enhanced, so flooded areas and normal water areas can be distinguished easily.
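To make the fusion idea concrete, the sketch below derives a pre-flood water mask from an NDWI threshold on the Landsat ETM green and near-infrared bands, a during-flood water mask from a low-backscatter threshold on the Radarsat scene, and labels pixels that are water only in the SAR image as flooded. This is a minimal illustration under stated assumptions, not the paper's actual method; the band combination, thresholds, and function names are hypothetical.

```python
# Minimal sketch of pre-flood optical / during-flood SAR fusion for flood mapping.
# NOT the paper's algorithm; thresholds and band choices are illustrative assumptions.
# Inputs are assumed to be co-registered 2-D numpy arrays.
import numpy as np

def ndwi(green, nir, eps=1e-6):
    """Normalized Difference Water Index from Landsat ETM green and NIR bands."""
    return (green - nir) / (green + nir + eps)

def flood_map(green_pre, nir_pre, sar_db_during,
              ndwi_thresh=0.0, sar_thresh_db=-15.0):
    """Label each pixel as 0 = dry, 1 = permanent (normal) water, 2 = flooded.

    green_pre, nir_pre : pre-flood Landsat ETM bands (reflectance)
    sar_db_during      : during-flood Radarsat backscatter (sigma0, dB)
    """
    permanent_water = ndwi(green_pre, nir_pre) > ndwi_thresh   # normal water extent
    water_during = sar_db_during < sar_thresh_db               # smooth water -> low backscatter
    flooded = water_during & ~permanent_water                  # water that is new during the flood

    labels = np.zeros(green_pre.shape, dtype=np.uint8)
    labels[permanent_water] = 1
    labels[flooded] = 2
    return labels

# Tiny synthetic example: right column is a river (permanent water),
# middle column is a field that is wet only in the during-flood SAR scene.
green = np.array([[0.10, 0.10, 0.08], [0.09, 0.10, 0.08]])
nir   = np.array([[0.30, 0.30, 0.02], [0.30, 0.30, 0.02]])
sar   = np.array([[-8.0, -18.0, -19.0], [-9.0, -17.5, -18.5]])
print(flood_map(green, nir, sar))   # 0 = dry, 1 = normal water, 2 = flooded
```

In practice the thresholds would be tuned per scene (for example with Otsu's method) and the SAR data speckle-filtered before thresholding; the fixed values above are placeholders.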

    • "In Slovenia, aerophotogrammetry was already successfully used in the case of the Stože landslide and the wet debris flow in Log pod Mangartom (Kosmatin Fras, 2001). Beside aerophotogrammetry, also satellite images are often used to find out areas damaged by floods (Yonghua et al., 2007; Wang et al., 1995; Liu et al., 2002; Frappart et al., 2006) or landslides (Singhroy et al., 1998; Nichol et al., 2006; Yamaguchi et al., 2003; Komac, 2006). LiDAR (Light Detection And Ranging) is getting an important position in observations of natural hazards and disasters. "
    [Show abstract] [Hide abstract]
    ABSTRACT: More and more natural disasters are happening around the world every year: floods, tsunamis, landslides, earthquakes, etc. They all have a common point: they cause large damage and are difficult to predict. This paper describes an approach for analysing a flash flood event. The study area is the Selška Sora River valley in W Slovenia, where a flash flood was caused by extreme rainfall on 18 September 2007. In the first part of the paper, flooded areas are determined using different data sources, with particular attention to satellite images and their applicability to recognising flooded areas. In the second part, the detected flooded areas are used to compute selected hydraulic parameters. This study shows an effective way to perform a post-flood hydraulic analysis. In recent years, satellite technology has made considerable progress that could significantly improve our ability to assess and predict natural hazards.
    Interpraevent 2012, Grenoble, France; 01/2012
    • "But it is sometimes difficult to distinguish the flooded area just from SAR backscattering, because waves caused by wind cause a bright reflection and thus misclassification. The simultaneous use of optical sensors or GIS data has been proposed for reducing misclassification [1] [2] [3]. ALOS/PALSAR, launched in 2006, could be useful for flood monitoring. "
    [Show abstract] [Hide abstract]
    ABSTRACT: We analyzed simultaneous observation data from a ground-based synthetic aperture radar (GB-SAR) and an airborne SAR (PiSAR) over a flood test site at which a simple house was constructed in a field. The PiSAR σ⁰ under the flood condition was 0.9 to 3.4 dB higher than under the non-flood condition. GB-SAR gives high spatial resolution, so we could identify a single-scattering component and a double-bounce component from the house. GB-SAR showed that the σ⁰ difference between the flooded and non-flooded conditions came from double-bounce scattering. We also confirm that entropy is a sensitive parameter among the eigenvalue decomposition parameters when the scattering process is dominated by double-bounce scattering. We conclude that σ⁰ and entropy are good parameters for detecting flooding, not only in agricultural and forest regions but also in urban areas. We also conclude that GB-SAR is a powerful tool to supplement satellite and airborne observations, which have relatively low spatial resolution.
    EURASIP Journal on Advances in Signal Processing, 01/2010; DOI: 10.1155/2010/560512
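The GB-SAR study above reports that the scattering entropy obtained from an eigenvalue decomposition is sensitive to the double-bounce signature of flooding. The sketch below shows how that entropy is typically computed from a 3×3 polarimetric coherency matrix (Cloude-Pottier style); the example matrices and constants are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of the scattering entropy H from the eigenvalues of a 3x3
# Hermitian coherency matrix T. The matrices below are synthetic examples.
import numpy as np

def scattering_entropy(T):
    """Entropy H in [0, 1] from the eigenvalues of a 3x3 Hermitian coherency matrix."""
    eigvals = np.linalg.eigvalsh(T)            # real eigenvalues for Hermitian input
    eigvals = np.clip(eigvals, 0.0, None)      # guard against tiny negative values
    p = eigvals / eigvals.sum()                # pseudo-probabilities p_i
    p = p[p > 0]                               # convention: 0 * log 0 = 0
    return float(-(p * np.log(p)).sum() / np.log(3))

# One dominant scattering mechanism (e.g. strong double bounce) -> entropy near 0.
T_single = np.diag([1.0, 0.01, 0.01])
# Three comparable mechanisms (depolarized scattering) -> entropy near 1.
T_mixed = np.diag([1.0, 0.8, 0.6])
print(scattering_entropy(T_single), scattering_entropy(T_mixed))
```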
    ABSTRACT: In this paper, a new fusion rule based on a pulse coupled neural network (PCNN) and the clarity of images is proposed for multi-band synthetic aperture radar (SAR) image fusion. By using a stationary wavelet-based nonsubsampled contourlet transform (SW-NSCT), we can calculate a flexible multiscale, multidirectional, anisotropic and shift-invariant representation of registered SAR images. A weighted fusion rule is applied to the low-frequency subbands to calculate the fused lowpass band. For the fusion of the high-frequency directional subband images, a PCNN model is constructed in which the linking strength of each neuron is determined by the clarity of the decomposed subband images. The fusion approach exploits the advantages of SW-NSCT in multiscale geometric representation and of the PCNN in determining fusion rules; as predicted, the obtained fusion image preserves much more of the texture and edge information of the input images than its counterparts. Experiments comparing the new algorithm with other existing fusion rules and methods show that the proposed approach is effective and provides better performance in fusing multi-band SAR images than some current methods.
    Signal Processing, 12/2009; DOI: 10.1016/j.sigpro.2009.04.027
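The fusion rule described in the entry above drives a pulse coupled neural network with a clarity measure to select high-frequency coefficients. The sketch below is a strongly simplified, hypothetical version: it omits the SW-NSCT decomposition, uses local variance as the clarity-based linking strength, and fuses two subbands by comparing PCNN firing counts. All constants and function names are assumptions, not the authors' implementation.

```python
# Simplified sketch of a PCNN-based fusion rule for two high-frequency subbands.
# The multiscale transform is omitted; "clarity" is approximated by local variance.
import numpy as np
from scipy.ndimage import uniform_filter, convolve

def local_clarity(band, size=3):
    """Local variance as a simple clarity (sharpness) measure per coefficient."""
    mean = uniform_filter(band, size)
    return uniform_filter(band * band, size) - mean * mean

def pcnn_firing_counts(band, beta, iterations=30,
                       a_theta=0.2, v_theta=20.0, v_l=1.0):
    """Count how often each neuron fires in a simplified pulse coupled neural network."""
    s = np.abs(band)                       # feeding input = coefficient magnitude
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])   # linking weights to the 8 neighbours
    y = np.zeros_like(s)
    theta = np.ones_like(s)
    counts = np.zeros_like(s)
    for _ in range(iterations):
        l = v_l * convolve(y, kernel, mode="constant")   # linking input from neighbours
        u = s * (1.0 + beta * l)                         # internal activity
        y = (u > theta).astype(s.dtype)                  # pulse output
        theta = np.exp(-a_theta) * theta + v_theta * y   # dynamic threshold
        counts += y
    return counts

def fuse_subbands(band_a, band_b):
    """Pick, per coefficient, the subband whose PCNN (driven by clarity) fires more."""
    counts_a = pcnn_firing_counts(band_a, local_clarity(band_a))
    counts_b = pcnn_firing_counts(band_b, local_clarity(band_b))
    return np.where(counts_a >= counts_b, band_a, band_b)

# Synthetic subbands with strong detail in different regions.
rng = np.random.default_rng(0)
a = rng.normal(scale=0.1, size=(32, 32)); a[8:16, 8:16] += 1.0
b = rng.normal(scale=0.1, size=(32, 32)); b[20:28, 20:28] += 1.0
print(fuse_subbands(a, b).shape)
```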