SAR Interferometry - Science topic

Explore the latest questions and answers in SAR Interferometry, and find SAR Interferometry experts.
Questions related to SAR Interferometry
  • asked a question related to SAR Interferometry
Question
3 answers
Hi, I’m a beginner in satellite image analysis. I want to know the lat/lon coordinates of some bursts of a Sentinel-1 image. I looked at the file names in the downloaded zip but couldn’t find any promising files (attached: file structure). Can someone teach me how I can obtain them?
Context: My purpose is to generate a coherence image and project it into QGIS. I used SNAP, following this tutorial up to p. 12 (https://step.esa.int/docs/tutorials/S1TBX%20TOPSAR%20Interferometry%20with%20Sentinel-1%20Tutorial_v2.pdf), but the coordinates were somehow lost from the first step (importing and choosing bursts to produce a split file). I am not sure why, but it apparently happens with other satellites too (https://earthenable.wordpress.com/2016/11/21/how-to-export-sar-images-with-geocoding-in-esa-snap/). I was able to produce the coherence without coordinates, so I'm thinking that if I can get the coordinates from the Sentinel file, I can just add them to the GeoTIFF myself.
I also want to ask: is this idea wrong? Are the Sentinel coordinates different from those of the coherence image, since it undergoes back-geocoding?
Relevant answer
Answer
Maybe you should study the SENTINEL-1 PRODUCT DATA TYPES.
Candidate Reference:
Regards,
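For what it's worth, the per-burst corner coordinates can be approximated from the geolocation grid tie points stored in the annotation XML inside the .SAFE folder. A minimal sketch, assuming the usual SAFE layout and the tag names from the Sentinel-1 product specification (the file path and the burst index below are placeholders):
# Sketch: read lat/lon tie points from a Sentinel-1 SLC annotation XML and
# bound one burst with them. Paths/tags assume the standard SAFE layout.
import xml.etree.ElementTree as ET

annotation_file = "S1A_IW_SLC__1SDV_....SAFE/annotation/s1a-iw1-slc-vv-....xml"  # adjust

root = ET.parse(annotation_file).getroot()
points = []
for gp in root.findall(".//geolocationGridPoint"):
    points.append({
        "line": int(gp.findtext("line")),        # azimuth line index (whole sub-swath)
        "pixel": int(gp.findtext("pixel")),      # range sample index
        "lat": float(gp.findtext("latitude")),
        "lon": float(gp.findtext("longitude")),
    })

# Tie-point lines refer to the whole sub-swath image, so slicing by
# linesPerBurst only gives an approximate burst footprint.
lines_per_burst = int(root.findtext(".//swathTiming/linesPerBurst"))
burst_index = 2                                  # e.g. the third burst (0-based)
lo, hi = burst_index * lines_per_burst, (burst_index + 1) * lines_per_burst
burst_pts = [p for p in points if lo <= p["line"] <= hi]
print(min(p["lat"] for p in burst_pts), max(p["lat"] for p in burst_pts))
print(min(p["lon"] for p in burst_pts), max(p["lon"] for p in burst_pts))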
  • asked a question related to SAR Interferometry
Question
2 answers
It is a method used to create an interferometric time series, developed by Hooper [2004-2007] following the persistent scatterer approach of Ferretti 2001. It produces surface line-of-sight deformation with respect to the multi-temporal radar repeat passes and mitigates the temporal phase decorrelation due to instrument errors, DEM error and the atmospheric contribution to the phase delay.
SNAP2StaMPS is a Python workflow developed by José Manuel Delgado Blasco and Michael Foumelis in collaboration with Prof. A. Hooper to automate the pre-processing of Sentinel-1 SLC data and their preparation for ingestion into StaMPS. Much appreciation goes to the great work of those honorable professors.
The StaMPS method follows 8 steps that are carried out in MATLAB on a virtual Unix machine or a Linux system.
If I understood this correctly, Step 3 selects separate groups (each subset contains a pre-determined density of pixels per square kilometer; those pixels have random phase) from the PS pixels initially selected in Step 2, that selection being based on their calculated spatially correlated phase, the spatially uncorrelated DEM error that is subtracted from the remaining phase, and the temporal coherence.
Then, in Step 4 (weeding), those groups of pixels per unit kilometer are further filtered and oversampled; in each group, the pixel with the highest SNR is taken as a reference pixel and the noise of the neighbouring pixels is calculated. Then, based on a pre-determined value ('weed_standard_deviation'), some of those neighbouring pixels are dropped and others are kept as PS pixels.
A) Am I correct?
B) What is a pixel with random phase?
C) What is the pixel noise? Is it related to having multiple elementary scatterers where none of them is dominant, so that their backscattered signal is received at a different collective phase at each acquisition even if the ground containing those scatterers was stable over time?
D) Due to the language barrier, I have read Hooper's 2007 paper, but I couldn't fully understand what the difference between correlated and uncorrelated errors is, and what spatially correlated/uncorrelated errors mean.
E) What is the difference between the spatially uncorrelated DEM (look angle) error that is filtered at StaMPS Step 3 and removed at Step 5, and the spatially correlated look angle error that is removed at StaMPS Step 7?
Some test results are attached, and I would appreciate it if someone could tell me how I may remove the persistent atmospheric contribution. I have only used the basic linear APS approach from the TRAIN toolbox developed by Bekaert 2015.
Relevant answer
Answer
I can't answer your question, but I do have to commend you for the most excellent detail, format and structure of any question I have seen on ResearchGate over all the years and thousands of questions I've seen go by!
  • asked a question related to SAR Interferometry
Question
3 answers
Hello researchers,
I have around 100 Sentinel-1 SLC images. I want to compute the coherence between successive images (1->2, 2->3… and so on). The conventional method is to read images 1 and 2, select the swath, apply orbit files, coregister them using the back-geocoding operator, apply ESD and then compute the coherence, and then repeat the same procedure with images 2 and 3, and so on.
I want to discuss another approach here
  1. Read image 1 and coregister all other images (2,3,4…100) with respect to 1.
  2. Now, for images 1 and 2, we can compute the coherence using the above-mentioned method.
  3. For the slave images 2 and 3, we stack them together, considering them already coregistered through image 1, and compute the coherence.
I want to understand whether this is a valid way to do it or not.
1. Can we assume, in the case of SAR SLC products, that images 2 and 3 are also coregistered with each other when both are coregistered with image 1?
2. Do different incidence angles affect the coregistration results?
3. Is there any effect of different range and azimuth pixel sizes?
4. What should be the optimal method to perform successive time series analysis?
Thanks
Relevant answer
Answer
I would suggest using isce2 stack processing (https://github.com/isce-framework/isce2).
You may find the tutorials at https://github.com/isce-framework/isce2-docs.
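If you stay with the conventional pairwise route, the per-pair chain can also be scripted rather than clicked through in SNAP; a hedged sketch that loops over successive acquisitions and calls SNAP's command-line gpt tool with a graph you build yourself (the graph file name and its master/slave/out parameters are placeholders for a graph defined in the Graph Builder, not a graph shipped with SNAP):
# Sketch: run a user-built SNAP graph (read -> split -> orbit -> back-geocoding
# -> ESD -> coherence -> write) for every successive Sentinel-1 pair.
# Requires SNAP's gpt on the PATH; "coherence_pair_graph.xml" and the
# -Pmaster/-Pslave/-Pout parameter names are placeholders you define yourself.
import subprocess
from pathlib import Path

slc_zips = sorted(Path("slc").glob("S1*_IW_SLC_*.zip"))

for master, slave in zip(slc_zips[:-1], slc_zips[1:]):
    # start-date field of the standard S1 file name, used only for the output name
    out = f"coh_{master.stem[17:25]}_{slave.stem[17:25]}.dim"
    subprocess.run(
        ["gpt", "coherence_pair_graph.xml",
         f"-Pmaster={master}", f"-Pslave={slave}", f"-Pout={out}"],
        check=True,
    )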
  • asked a question related to SAR Interferometry
Question
3 answers
Hello dear researchers, I need guidance related to deramp/demod. Working with a single SLC_IW dataset, after applying deramp/demod the signal shows stripes. The process is described in this document: https://sentinels.copernicus.eu/documents/247904/1653442/Sentinel-1-TOPS-SLC_Deramping.pdf/b041f20f-e820-46b7-a3ed-af36b8eb7fa0?t=1533744826000 . I need to reproduce its Figure 2, the spectrum after deramp/demod. I have attached one intensity image, and after deramp/demod I encountered stripes. Looking forward to your advice.
Relevant answer
Answer
Roland Akiki Muhammad Amjad Iqbal Hi Roland and Amjad,
Would you please let me know how I can apply de-ramping and re-ramping to S1 data? I have access to GAMMA and ESA SNAP, but couldn't find any module to re-ramp the data. Is there any open-source or freely available MATLAB/Python code for these two processes?
Many thanks,
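Not a full answer, but the core of the de-ramping described in the ESA note linked above is a per-sample complex multiplication, and re-ramping is the conjugate multiplication. A minimal numpy sketch, assuming you have already derived the Doppler rate term kt(tau), the Doppler centroid fdc(tau) and the reference azimuth time eta_ref(tau) from the annotation as the note describes; the exact signs and the form of the linear demodulation term should be verified against the note:
# Sketch of TOPS de-ramping / re-ramping for one burst (verify conventions
# against the ESA "Sentinel-1 TOPS SLC deramping" technical note).
# slc     : complex burst array, shape (n_az, n_rg)
# kt      : Doppler rate term per range sample, shape (n_rg,)
# fdc     : Doppler centroid per range sample, shape (n_rg,)
# eta     : zero-Doppler azimuth time of each line, shape (n_az,)
# eta_ref : reference azimuth time per range sample, shape (n_rg,)
import numpy as np

def deramp_phase(kt, fdc, eta, eta_ref):
    d_eta = eta[:, None] - eta_ref[None, :]
    # quadratic de-ramping term plus linear demodulation term
    return -np.pi * kt[None, :] * d_eta**2 - 2.0 * np.pi * fdc[None, :] * d_eta

def deramp(slc, kt, fdc, eta, eta_ref):
    return slc * np.exp(1j * deramp_phase(kt, fdc, eta, eta_ref))

def reramp(slc_deramped, kt, fdc, eta, eta_ref):
    # re-ramping / re-modulation is the conjugate multiplication
    return slc_deramped * np.exp(-1j * deramp_phase(kt, fdc, eta, eta_ref))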
  • asked a question related to SAR Interferometry
Question
8 answers
So far, I’ve tried to contact as many official distributors of SAR products as I can. One common response that I got from these distributors is that, although they're happy to take orders for new acquisitions, they don’t have archived data for my study areas in Ethiopia. Although near-real-time satellite acquisitions at submeter resolution are very expensive, I could manage that. Even though high-resolution (submeter) archived data are of high importance for my research, unfortunately I haven't succeeded in obtaining them so far.
To summarize, archived data collected between 2005 and 2010, in VV polarization, are mandatory for my study to conduct interferometric analyses. So, I wonder if anyone can help me get out of this problem. Any commercially available archived data meeting the above-mentioned requirements, in either ascending or descending orbit, are welcome.
Relevant answer
Answer
Although I am not sure that such high-resolution SAR data are commercially available, I suggest you check the Alaska Satellite Facility (https://asf.alaska.edu/). You might find something suitable for your case!
  • asked a question related to SAR Interferometry
Question
5 answers
Many open-source tools for SAR interferometry (InSAR) are available to everyone. But is there any persistent and distributed scatterer (PSDS) InSAR software available? I really want to make one available if there is a big gap.
Relevant answer
Answer
Hi, Dinh!
You may consider using the semi-automated SNAP-StaMPS workflow. SNAP is used for the pre-processing of SAR data, generating the differential interferograms (DInt) for time-series InSAR processing in StaMPS. StaMPS supports both PS-InSAR and SBAS techniques; however, the current version of SNAP does not support automatic generation of DInt for SBAS processing in StaMPS. Other software packages such as ISCE and Doris, I believe, can do the job.
I hope this helps. Good luck!
  • asked a question related to SAR Interferometry
Question
4 answers
We perform differential SAR interferometry to analyze land movement. In the case of landslides, the area generally suffers from decorrelation. So, for confidence in the result, what minimum coherence threshold should be used so that outliers are removed?
Relevant answer
Answer
Well, it largely depends on the nature of the surface.
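Whatever threshold you settle on, applying it is a one-liner; a minimal sketch, assuming the coherence and displacement bands have been exported as numpy arrays (the file names and the threshold value are only placeholders to tune per scene and season):
import numpy as np

coherence = np.load("coherence.npy")         # placeholder: coherence band as array
displacement = np.load("displacement.npy")   # placeholder: LOS displacement map

threshold = 0.3                              # example value only
masked = np.where(coherence >= threshold, displacement, np.nan)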
  • asked a question related to SAR Interferometry
Question
5 answers
I need to know the computing requirements to do SAR time series analysis and how many SAR rasters (images) should be analyzed?
Relevant answer
Answer
Well, it depends on how you are doing the SAR data processing.
If you are using software like SNAP, you need a powerful processing computer.
As of now, Google Earth Engine is a suitable alternative if you are comfortable with programming.
The number of images required depends entirely on what you are doing with the SAR data.
  • asked a question related to SAR Interferometry
Question
6 answers
The SAR data shows decorrelation for highly vegetated terrains. Has any of the latest software overcome this shortcoming?
Relevant answer
Answer
Hi Swati,
Decorrelation in highly vegetated terrain is to be expected and, IMHO, I don't think the choice of software will have a significant impact on the results; the processing method and the properties of the input data matter more.
SNAP by ESA is freely available software capable of interferometric processing; GAMMA is (very) expensive but has everything one could ask for; there are also others, e.g. ERDAS IMAGINE.
  • asked a question related to SAR Interferometry
Question
2 answers
I have the scattering matrix images (8 images: S11_real, S11_imaginary, and similarly for S22, S12, S21) and I need to create the coherency matrix images (6 images: diagonal and upper elements T11, T22, T33, T12, T13, T23). The sensor is mono-static, so S12 = S21. How can it be done using Python/MATLAB? Kindly share the library, code or equations required for it.
Relevant answer
Answer
Hi,
You may use the following Python code to create T3 from the S2 matrix. Since you have mentioned the sensor is monostatic, considering the reciprocity constraint S12 = S21, the code should produce the required output. Find the attached formulation and Python file.
Good luck with polarimetry :-)
# -*- coding: utf-8 -*-
"""
Created on Mon May 3 09:00:21 2021
@author: Narayana
"""
import numpy as np
# Example single-pixel values (placeholders); replace with your own data
S11_real = 1
S11_imag = 0
S21_real = 0
S21_imag = 0
S22_real = 1
S22_imag = 0
# Scattering matrix (monostatic, S12 = S21)
S2 = np.array([[S11_real+1j*S11_imag, S21_real+1j*S21_imag],
               [S21_real+1j*S21_imag, S22_real+1j*S22_imag]])
# Kp- 3-D Pauli feature vector
Kp = np.expand_dims(np.array([S2[0,0]+S2[1,1], S2[0,0]-S2[1,1], S2[1,0]]),axis=1)
# 3x3 Pauli Coherency Matrix
T3 = np.matmul(Kp,np.conj(Kp).T)
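Since the question is about full images rather than a single pixel, here is a hedged vectorized variant of the same construction, assuming the real/imaginary channels have been loaded as 2-D numpy arrays of equal shape; note it uses the conventional Pauli normalization (1/sqrt(2) and a factor 2 on the cross-polar term), which the single-pixel snippet above omits:
import numpy as np

def s2_to_t3(S11, S12, S22):
    """Per-pixel 3x3 coherency matrix from a monostatic scattering matrix.
    S11, S12, S22 are complex 2-D arrays (S21 == S12 by reciprocity).
    Returns T with shape (rows, cols, 3, 3)."""
    k = np.stack([S11 + S22, S11 - S22, 2.0 * S12], axis=-1) / np.sqrt(2.0)  # Pauli vector
    T = k[..., :, None] * np.conj(k)[..., None, :]   # outer product per pixel
    return T

# Example with the placeholder single-pixel values used above:
S11 = np.array([[1 + 0j]]); S12 = np.array([[0 + 0j]]); S22 = np.array([[1 + 0j]])
T = s2_to_t3(S11, S12, S22)
T11, T22, T33 = T[..., 0, 0].real, T[..., 1, 1].real, T[..., 2, 2].real
T12, T13, T23 = T[..., 0, 1], T[..., 0, 2], T[..., 1, 2]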
  • asked a question related to SAR Interferometry
Question
3 answers
I'm looking for a method to find the Doppler center of a complex SAR image. Can anyone suggest the easiest way to estimate the Doppler center of a Sentinel-1 image?
Relevant answer
Answer
The Doppler center (line) depends on the S/C position and velocity in the rotating Earth-fixed (ECEF) coordinate system, so formally you would need orbital information. Luckily, the orbital tube of Sentinel-1 is controlled accurately and provides a quite stable S/C position and velocity as a function of the Sentinel-1 ascending node (ECEF) crossing time. This crossing time oscillates with a magnitude of 0.3 s and precesses some +7 seconds per year. So a previous estimate of the Doppler center for a particular UTC time (N times 10 days apart) might help to obtain a guesstimate for the next Doppler line at UTC(0) + N x 10.0 days + N x 10/365.25 x 7 sec.
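If an estimate from the data itself (rather than from orbit geometry) is enough, the classic correlation-based estimator, i.e. the average phase of the one-lag azimuth autocorrelation, is a few lines of numpy; a sketch, assuming the scene or burst is available as a complex array and the azimuth sampling frequency (PRF) is known:
import numpy as np

def doppler_centroid(slc, prf):
    """Correlation (one-lag) Doppler-centroid estimate in Hz.
    slc : complex array, shape (n_azimuth, n_range); prf in Hz."""
    # one-lag autocorrelation along azimuth, accumulated over the whole block
    acc = np.sum(np.conj(slc[:-1, :]) * slc[1:, :])
    return prf * np.angle(acc) / (2.0 * np.pi)

def doppler_centroid_per_bin(slc, prf):
    """Same estimate per range bin, to see the centroid vary across the swath."""
    acc = np.sum(np.conj(slc[:-1, :]) * slc[1:, :], axis=0)
    return prf * np.angle(acc) / (2.0 * np.pi)
Note that the estimate is only known modulo the PRF; the orbit-based reasoning above is what resolves that ambiguity, and for TOPS data the centroid additionally sweeps along azimuth within each burst because of the antenna steering.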
  • asked a question related to SAR Interferometry
Question
5 answers
Hello
I'm working on SAR image co-registration. I want to divide my image into two looks with the spectral diversity method, but I did not find the complete formulas in any article.
Does anyone know how I can do this? Or do you know the paper that fully explains the algorithm?
Relevant answer
Answer
Thank you for your answer.
Is it necessary to center the images to the Doppler centroid before using the Fourier transform?
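For the band-splitting part itself (not the full ESD estimator), a minimal numpy sketch: go to the azimuth spectral domain, keep the lower and the upper half of the band centred on the Doppler centroid, and transform back, which yields the two looks whose differential phase is then exploited. Whether you must first centre on the Doppler centroid is exactly the point of the question above; the sketch simply takes the centroid as an input:
import numpy as np

def split_azimuth_looks(slc, fdc=0.0, prf=1.0):
    """Split an SLC into a low-band and a high-band azimuth look.
    slc : complex array (n_az, n_rg); fdc, prf in Hz (fdc = Doppler centroid).
    A plain boxcar split; windowing/guard bands are omitted for brevity."""
    n_az = slc.shape[0]
    spec = np.fft.fft(slc, axis=0)
    freq = np.fft.fftfreq(n_az, d=1.0 / prf)
    # wrap frequencies around the Doppler centroid so the band is contiguous
    rel = (freq - fdc + prf / 2.0) % prf - prf / 2.0
    low = np.where(rel[:, None] < 0, spec, 0)
    high = np.where(rel[:, None] >= 0, spec, 0)
    return np.fft.ifft(low, axis=0), np.fft.ifft(high, axis=0)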
  • asked a question related to SAR Interferometry
Question
4 answers
Without using the zip file with metadata, is there any way to use just the Geotiff for flood detection? (using Snappy or any other API) Any insight will be highly appreciated.
Relevant answer
Answer
If you are not interested in working with the raw ZIP file of the Sentinel data, open the image in the SNAP software and save it in GeoTIFF format again. By doing so, you will be able to generate a new GeoTIFF file with its metadata appended to it. Later you can use the same single GeoTIFF file for your analysis.
I have already added an answer previously on how to use the Snappy API, Graph Builder and GPT tool along with some references for flood detection.
I hope it helps. Thanks!
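If you prefer to script the re-save step instead of using the GUI, a minimal snappy sketch (assuming snappy is configured for your SNAP installation; the input path is a placeholder):
# Sketch: open a SNAP-readable product and re-save it as GeoTIFF so the
# georeferencing travels with the output file.
from snappy import ProductIO

product = ProductIO.readProduct("S1A_..._scene.zip")      # placeholder path
ProductIO.writeProduct(product, "s1_scene_with_metadata", "GeoTIFF")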
  • asked a question related to SAR Interferometry
Question
11 answers
I want to know whether Sentinel data can be processed using MATLAB. My area of interest is SAR interferometry.
  • asked a question related to SAR Interferometry
Question
4 answers
When calculating interferometric coherence, why can't you do so on a pixel-by-pixel basis? I know the equation for estimating coherence, coherence = |S1∙S2*| / √(S1∙S1*∙S2∙S2*), where S1 and S2 are the two single look complexes. And I know this calculation uses a maximum likelihood estimator, but why do you need to specify an estimation window, and why can't the estimation window size be 1?
Thank you.
Relevant answer
Answer
You are absolutely right. Compared to optical remote sensing, where 'adjacency' is a higher-order effect, the signal impact of 'neighbors' is much higher here. A perfect coherence estimator would include the dipole distribution, the local 3D geometry for ray tracing and radiosity estimation, and a Gaussian-shaped weighting window to 'reflect' the mixing and superimposition of the underlying physical process. I have often thought about compiling a paper on all this in connection with a multi-stage, alternative phase unwrapping... Hope it helps. If not, you can send me an email: rogass@gfz-potsdam.de
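To see concretely why the window cannot be a single pixel: with a 1x1 window the estimator reduces to |S1·S2*| / (|S1|·|S2|), which is identically 1 whatever the data, so only averaging over a window of pixels assumed to share the same statistics yields an informative estimate. A minimal sketch with a uniform boxcar window:
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, win=5):
    """Sample coherence magnitude of two coregistered complex SLC arrays.
    win = 1 returns an array of ones, which is exactly the problem with a
    single-pixel 'window'."""
    cross = s1 * np.conj(s2)
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) *
                  uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)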
  • asked a question related to SAR Interferometry
Question
4 answers
I am looking for a book to learn how to design a repeat-pass interferometric SAR system, considering the necessary system parameters and simulating them in MATLAB.
Relevant answer
Answer
thanks for your favor
  • asked a question related to SAR Interferometry
Question
1 answer
In the interferometric water cloud model we estimate 3 unknown parameters via non-linear least squares regression, but if we do not provide proper initial values for those parameters, the regression does not converge. I am trying to find out how to assign the initial parameter values in a correct way.
Relevant answer
Answer
The water cloud model has been around for some time, so you could find from literature a few examples of values for the three parameters, preferably from similar land cover or objects to what you are studying. The more realistic the initial values, the faster the process will converge to a solution.
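As a concrete illustration of the point about initial values, a hedged sketch of fitting the classic (non-interferometric) water cloud model with scipy, where p0 carries the literature-based starting values; the functional form, parameter names and numbers here are generic placeholders rather than the specific interferometric formulation you are using:
import numpy as np
from scipy.optimize import curve_fit

def water_cloud(V, A, B, sigma_soil, theta=np.deg2rad(35.0)):
    """Classic water cloud model for backscatter (linear units).
    V: vegetation descriptor; A, B, sigma_soil: parameters to estimate."""
    tau2 = np.exp(-2.0 * B * V / np.cos(theta))          # two-way canopy attenuation
    return A * V * np.cos(theta) * (1.0 - tau2) + sigma_soil * tau2

# Synthetic example data standing in for real observations
V_obs = np.linspace(0.5, 4.0, 30)
sigma_obs = water_cloud(V_obs, 0.08, 0.12, 0.05) + np.random.normal(0, 0.002, 30)

# p0 holds the literature-based initial guesses the answer above refers to
p0 = [0.1, 0.1, 0.05]
params, cov = curve_fit(water_cloud, V_obs, sigma_obs, p0=p0)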
  • asked a question related to SAR Interferometry
Question
1 answer
I have written a Sentinel-1 Level-0 image formation tool in Matlab. I am able to extract and decode all the information in the Level-0 file, both for the image formation and for the spacecraft position. This tool is able to extract each burst and process it into a complex image chip. I am currently unable to work out how to determine an exact antenna beam center ground/pointing location (which should then allow me to map the formed complex pixels to lat/lon). My current method/algorithm is biased in both lat (~0.3 degrees) and lon (~0.1 degrees) at a target latitude of ~45N. I have assumed that the velocity vector defines the x axis, the ECEF normal to the spacecraft defines the z axis and the cross product is the y axis. I also assume the quaternions provide how to rotate the antenna look direction from the y axis to earth (though I am not confident in my application of the rotational values...). I further assume a WGS-84 earth model and an imaging location at sea level, and find where the rotated y-axis intersects the WGS-84 model. But, as stated, this yields a significant earth surface error relative to the positioning provided in the available SLC Level-1 product. I'm very open to exchanging information and engaging in discussion to better understand this less-documented-than-desired aspect of the algorithm.
Relevant answer
Answer
  • asked a question related to SAR Interferometry
Question
6 answers
I have written an image formation algorithm for Sentinel-1 Level-0 IW data. To properly form the image, one *must* perform the frequency unwrapping on the azimuth data. I have found I am able to perform an azimuth compression on this data without needing to get to the documented algorithmic step of doing the second unwrap.
How and why, exactly, does Sentinel-1 have this frequency wrapping of azimuth data? What benefit is there to the documented additional processing steps that lead to the documented second unwrapping?
Relevant answer
Answer
I like pictures. Here is an example using real Sentinel-1 data taken right after range processing.
The upper left image is the time-frequency representation of a single azimuth column of data. Here frequency is on the y axis (the numbers are not correct) with down being positive, and time is on the x axis, increasing from right to left (those numbers are also not correct). The equivalent time-domain amplitude and frequency are plotted directly below it. Notice how, in time-frequency, there are several bands of data running from upper left to lower right - it is this wrapping that UFR looks to undo.
The rest of the UFR processing steps run down through the middle: first (top) the frequency replication, followed by the deramp (middle) and the LPF (bottom). Re-ramping is in the upper right, with the time-domain magnitude and frequency below it.
I understand how to do the UFR processing, but I am still unsure how or why the system induces this artifact in the azimuth domain - particularly after the range processing (shifting and compressing). I'm still open to a detailed explanation of how this happens.
  • asked a question related to SAR Interferometry
Question
3 answers
The following data comes directly from S1B_IW_SLC__1SDV_20180426T063818_20180426T063845_010651_013707_1395.SAFE\annotation\s1b-iw1-slc-vv-20180426t063819-20180426t063844-010651-013707-004.xml:
<attitude>
<time>2018-04-26T06:38:19.750001</time>
<frame>GM2000</frame>
<q0>2.048773e-01</q0>
<q1>3.932928e-01</q1>
<q2>3.003435e-01</q2>
<q3>8.444761e-01</q3>
<wx>-5.878133015357889e-05</wx>
<wy>-9.468332864344120e-04</wy>
<wz>-4.792133113369346e-04</wz>
<roll>-6.302749733477862e+00</roll>
<pitch>-5.238290812737592e+01</pitch>
<yaw>4.225907254185638e+01</yaw>
</attitude>
I cannot find *any* set of equations to convert from the provided quaternions (q0-q3) to pitch, roll, yaw and vice versa. Yes, I have started with all of the ESA published documents and then expanded out to multiple other documents on the web. I have coded every variant I have encountered in Matlab, but am unable to get numbers that agree even to the ones digit. I have even tried different application sequences (instead of just the published 3-2-1 order).
Worse, from a documentation point of view, I believe that here q3 is the scalar part, but almost all Sentinel-1 documentation defines q0 to be the scalar part of the quaternion. I get that this convention is generally not consistent and/or both are acceptable - BUT a given system should be self-consistent.
As you might expect, this is a key step in being able to associate each formed image pixel with an exact ground coordinate. Without an understanding of how to apply (q0-q3) as pitch, roll, yaw, it is clear that any attempt I make to use them for vehicle rotation/orientation will be wrong. "Close" isn't good enough - I need the full, exact understanding so I can precisely apply the vehicle orientation to my image formation.
Please, don't just point me to Wikipedia or some web page - try the numbers first - I have.
Relevant answer
Answer
Hi John,
If Sentinel-1's quaternion is interpreted as q0·i + q1·j + q2·k + q3, where q3 = 0.8444761 is the scalar part, then by choosing the YXZ sequence of rotation axes for Tait–Bryan angles, the equivalent Euler angles (in degrees) are shown in the figure above.
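For reference, the same conversion can be reproduced with scipy's Rotation class, which also uses the scalar-last [x, y, z, w] convention, so the annotation values map directly onto [q0, q1, q2, q3]; the 'YXZ' (intrinsic) sequence follows the interpretation above and should still be checked against the roll/pitch/yaw values in the annotation:
import numpy as np
from scipy.spatial.transform import Rotation as R

# Quaternion from the annotation snippet above, scalar-last: [q0, q1, q2, q3]
q = [2.048773e-01, 3.932928e-01, 3.003435e-01, 8.444761e-01]

rot = R.from_quat(q)                       # scipy expects [x, y, z, w]
print(rot.as_euler("YXZ", degrees=True))   # intrinsic Y-X-Z Tait-Bryan angles
print(rot.as_euler("zyx", degrees=True))   # another sequence, for comparison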
  • asked a question related to SAR Interferometry
Question
2 answers
Hi
I'm working with SARscape, and during the "geocoding and radiometric calibration" process I encounter the message "Memory not found", even though there is no problem or limitation with the CPU, RAM or hard disk free space. The error report file is attached.
What might be the reason for the problem?
Relevant answer
Answer
Are you connected over remote desktop? If so it is a GPU issue. Use a VNC instead of remote desktop. Alternatively, check the size of your virtual memory and paging file. Use preferences in SARScape to change the temp directory to somewhere with plenty of space.
  • asked a question related to SAR Interferometry
Question
6 answers
I want to make an interferogram using ALOS PALSAR-2, but the data I got are ALOS PALSAR-2 imagery at level 1.5, while the SARscape tutorial I learned from uses ALOS PALSAR-2 level 1.1. Can this be done, and if so, where can I find a suitable tutorial?
Relevant answer
Answer
Yes, but the best is to get your images already focused (L1.1). You can focus them yourself, but it will never be as good as having them already focused by JAXA or others.
  • asked a question related to SAR Interferometry
Question
5 answers
I need to find free SAR images (spaceborne or airborne).
In particular, I need complex SAR images and also stereo SAR images of the same area,
so that I can examine the results of both interferometric and radargrammetric algorithms.
Thanks.
Relevant answer
Answer
Please go through this website: https://vertex.daac.asf.alaska.edu/# . Data from several SAR sensors are available there.
  • asked a question related to SAR Interferometry
Question
12 answers
Hello all,
I am currently working with SLC data (Level 1.1) acquired by the ALOS PALSAR sensor, but I am not able to display the image using MATLAB. Can someone please tell me which parameters are important to look into while processing SLC data, as I wish to obtain the SPAN of the SLC data?
Relevant answer
Answer
Hi
I think you cannot directly read the ALOS PALSAR SLC image in MATLAB. As suggested by Wensheng Wang, you can process the data and export it as an [S2] matrix. The output from PolSARpro is a .bin file. You can open this SLC ([S2] matrix) in MATLAB using the "multibandread" function and display the images using the "imagesc" function.
Hope this would help you solve your problem
Musthafa
  • asked a question related to SAR Interferometry
Question
1 answer
Kindly suggest a step-by-step procedure (cheat sheet or crib sheet) for SAR image analysis (particularly for creating interferogram fringes).
Also, please suggest which open-source SAR image analysis tool is suitable.
Relevant answer
Answer
1. Download ESA SNAP from here: http://step.esa.int/main/toolboxes/snap/ and install it
2. Import your data via File > Import > SAR Sensors. You need two images of the same track (relative orbit) for interferometry.
3. If you are using Sentinel-1, ERS or ENVISAT: apply the orbit file (Radar > Apply Orbit File)
4. Coregister your image pair (Radar > Coregistration)
5. Create the interferogram (Radar > Interferometry > Products > Interferogram)
These are the steps that work in general, but they may need to be adjusted according to the sensor used.
Tutorial for Sentinel-1 IW SLC data
  • asked a question related to SAR Interferometry
Question
4 answers
Can anyone please help me? I don't know what the problem is: I downloaded Sentinel-1 data (Level-1 SLC) and opened it in the Sentinel-1 Toolbox. But when I add the products in the InSAR coregistration window, I don't have any layers to select for the master image, and the toolbox always reports the problem: "Operator 'CreateInSARStackOp': Value for 'Slave Bands' is invalid". What is the problem? Do I have to preprocess the SLCs?
I already debursted them, but that doesn't help either.
Thank you for your advice!
Relevant answer
Answer
For using Sentinel data in SNAP (which I recommend), you need some pre-processing steps, such as splitting/debursting the products, or you can use the Graph Builder (Tools > Graph Builder) with the Radar > InSAR Graphs > TOPSAR Coreg Interferogram IW All Swaths graph.
That should work.
  • asked a question related to SAR Interferometry
Question
3 answers
Hello everyone,
I have been trying to implement the Wavenumber Domain Algorithm (WDA) to process simulated FMCW-SAR data. For those who have open access to IEEE website, the paper describing this very algorithm can be found in the link on the bottom. The algorithm is slightly different from the common WDA for pulse SAR: the main difference is given by the Stolt interpolation which now takes into account the movement of the sensor during the sweep.
This is my first time trying to focus SAR data, thus I am having a bit of a hard time. I was wondering whether there are experienced SAR processing people out there who are perhaps familiar with FMCW-SAR too. If you are out there .. well, just have a look at my MATLAB code (see attachment).
The FFT-based spectrum of the signal was compared to the ones achieved via Principle of Stationary Phase (PoSP). It looks like there's good matching until the Stolt interpolation. I am not at all convinced of my Stolt interpolation and I would like to ask two things:
  1. How to define the vector containing the mapped range frequencies? See the block "RE-MAPPING RANGE FREQUENCIES" in section "STOLT INTERPOLATION" of my Matlab code. What I did .. I took into consideration the biggest mapped range frequency, and I used it to define a new sampling frequency. This new sampling frequency was then used to define the vector of the mapped range frequencies. See code rows 219 - 234.
  2. The unambiguous range interval in FMCW-SAR is directly proportional to the sampling frequency. Unfortunately, when applying 1., the slant-range extension is no longer equal to the unambiguous range. Compare 'Ru' to 'c*tp(end)/2'. How so?
Overall .. I would say there's something quite wrong with my Stolt interpolation. Is there anyone capable of helping me out?
Regards,
Emiliano
Update: Matlab attachment was eliminated, see new attachment in the response below.
Relevant answer
Answer
Hello Antonio,
Thanks for your time and sorry for the late reply (holidays). Also, sorry for the missing file: my mistake. See the new attachment, where you'll find the missing file and a new main script ('WDA.m'). Forget about point 2; my question did not make much sense. Regarding point 1 ..
I get your point: both azimuth and range frequency affect the value of the "new" range frequency ('f1' in the paper). My question is different, though: the "new" range frequency vector changes along the azimuth frequency. See the data support in Fig. 11: the "new" range frequency vector is azimuth dependent and defines a sort of arc-shaped support. Am I correct to assume that there is no use in computing the value of the spectrum for the afore-mentioned domain since the 2D-IFT would fail with such a support? What I did (see rows 298 - 300), I took into consideration the vector at a specific azimuth frequency and I replicated it for the remaining ones in order to achieve a rectangular data support: see Fig. 12. Then, the spectrum was computed for the points of this new grid and the signal was finally transformed via a 2D-IFT.
Some question arise, hope you can help me out ..
- Are my considerations (see above) correct?
- If yes, would you do something different regarding rows 298 - 300: how would you define the "new" range frequency vector to define a rectangular support?
Let me know your thoughts.
Thanks for your time,
Emiliano
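On the rectangular-support question, a generic hedged sketch (not the FMCW-specific mapping from the paper) of how the row-wise re-gridding is usually organised: for every azimuth frequency the spectrum is interpolated from its own mapped range-frequency axis onto one common output axis, which is what makes the subsequent 2-D IFFT legitimate; real and imaginary parts are interpolated separately for clarity:
import numpy as np

def stolt_regrid(spectrum, f_mapped, f_out):
    """Re-interpolate each azimuth-frequency row onto a common range-frequency axis.
    spectrum : complex array (n_az_freq, n_rg_freq), 2-D spectrum after matched filtering
    f_mapped : array (n_az_freq, n_rg_freq), the 'new' range frequency of every sample
               (azimuth-dependent, i.e. the arc-shaped support); rows must be increasing
    f_out    : 1-D array, the common rectangular output axis
    Returns the spectrum on the rectangular (n_az_freq, len(f_out)) grid."""
    out = np.zeros((spectrum.shape[0], f_out.size), dtype=complex)
    for i in range(spectrum.shape[0]):
        out[i] = (np.interp(f_out, f_mapped[i], spectrum[i].real, left=0.0, right=0.0)
                  + 1j * np.interp(f_out, f_mapped[i], spectrum[i].imag, left=0.0, right=0.0))
    return out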
  • asked a question related to SAR Interferometry
Question
4 answers
Interferogram generation is done on a TanDEM-X pairs in CoSSC format. Can I now calculate the phase standard deviation and make assumptions how accurate the interferogram is over a specific (distributed) target? Additionally, how does multilooking of the interferogram and different coherence windows affect this accuracy?
Relevant answer
Answer
Hello,
Tough question indeed, and to be honest I do not have an exact answer to these questions, and I do not have any experience with TanDEM-X either. But I have some experience with coherence. Basically, if the coherence is quite low, the generated fringe is in doubt, so man-made structures should have higher coherence. But in many of my cases, especially when observing rural housing, the coherence over houses may be persistently low. So I guess you should consider what target is being investigated.
Just a thought..
Cheers, Bambang
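One quantitative handle that is often used for distributed targets, and which shows the effect of multilooking directly, is the Cramér-Rao-type approximation sigma_phi ≈ sqrt((1 − γ²) / (2·L·γ²)) for coherence γ and L independent looks; it only holds for reasonably large L, and the effective number of looks is smaller than the estimation window size when neighbouring samples are correlated. A small sketch:
import numpy as np

def phase_std(coherence, looks):
    """Approximate interferometric phase standard deviation (radians),
    Cramer-Rao-type bound for L independent looks; valid for large L."""
    g = np.clip(coherence, 1e-6, 0.999999)
    return np.sqrt((1.0 - g**2) / (2.0 * looks * g**2))

for L in (4, 16, 32):
    print(L, np.degrees(phase_std(0.7, L)))   # e.g. gamma = 0.7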
  • asked a question related to SAR Interferometry
Question
3 answers
I am planning to register missile-borne SAR images to optical pictures and want to know the differences between missile-borne and airborne SAR images.
Relevant answer
Answer
SAR cares about imaging geometry, and not the vehicle/platform on which the radar is located...  Once you have an image, you can't tell what the vehicle was from which the data was collected...  So, no difference...
  • asked a question related to SAR Interferometry
Question
7 answers
I want to test my code to register SAR images to optical image and need a database.
Can anyone help?
Relevant answer
Answer
Some interesting complex SAR images may be found at
  • asked a question related to SAR Interferometry
Question
3 answers
As mentioned in Fialko 2001, for 3-D displacement from InSAR a linear system of equations should be solved for each pixel:
1. Two equations from the ascending and descending orbits:
  [Un sin φ − Ue cos φ] sin θ + Uu cos θ + δlos = dr   (1)
Where:
  φ … the azimuth of the satellite heading vector (positive clockwise from north)
  θ … the radar incidence angle at the reflection point
  dr … the LOS displacement at the reflection point
  δlos … the relative measurement error
2. One equation for the azimuth offset from the descending orbit:
  Un cos φ + Ue sin φ + δazo = dazo   (2)
My question is how to obtain dazo from an unwrapped differential interferogram.
Relevant answer
Answer
Have a look at the paper by Joughin et al. 1998 that I have attached. Though it explains how to determine the 3D flow vector of an ice flow using ERS interferometry, the principle should be the same as in your case.
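As an illustration of how the per-pixel system is assembled once dazo is available (it comes from azimuth offsets / multiple-aperture interferometry rather than from the unwrapped phase itself), a hedged numpy sketch solving eqs. (1)-(2) above in the least-squares sense for one pixel; the heading azimuths and incidence angles are illustrative placeholders:
import numpy as np

def solve_3d(d_los_asc, d_los_dsc, d_azo_dsc,
             az_asc=np.deg2rad(347.5), az_dsc=np.deg2rad(192.5),
             inc_asc=np.deg2rad(23.0), inc_dsc=np.deg2rad(23.0)):
    """Solve eqs. (1)-(2) for one pixel: unknowns [Un, Ue, Uu].
    The default heading and incidence angles are placeholders; take the
    real values from your scene geometry."""
    G = np.array([
        # ascending LOS, eq. (1)
        [np.sin(az_asc) * np.sin(inc_asc), -np.cos(az_asc) * np.sin(inc_asc), np.cos(inc_asc)],
        # descending LOS, eq. (1)
        [np.sin(az_dsc) * np.sin(inc_dsc), -np.cos(az_dsc) * np.sin(inc_dsc), np.cos(inc_dsc)],
        # descending azimuth offset, eq. (2)
        [np.cos(az_dsc), np.sin(az_dsc), 0.0],
    ])
    d = np.array([d_los_asc, d_los_dsc, d_azo_dsc])
    u, *_ = np.linalg.lstsq(G, d, rcond=None)
    return u   # [Un, Ue, Uu]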
  • asked a question related to SAR Interferometry
Question
5 answers
To resolve the 3-D surface displacement from DInSAR, it is required to know the incidence angle at each target pixel and the azimuth of the satellite heading vector (clockwise positive from north) to form three equations in three unknowns (Un, Ue and Uv) for each pixel.
The azimuth vector differs according to the look direction of the satellite, and I use the ERS satellites and ENVISAT.
Relevant answer
Answer
Mathijs' answer is very good, except that ERS/Envisat goes west(!) at high latitudes. At the equator, the ascending ground-track azimuth is 450° − 98.54° (orbit inclination) − 3.93° (Earth rotation) = 347.53°. Descending at the equator is 90° + 98.54° + 3.93° = 192.47°. The azimuth at other latitudes may be found by spherical trigonometry, becoming 270° (west) at high latitudes.
Kjell
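Kjell's spherical-trigonometry remark can be turned into a couple of lines; a sketch assuming a circular orbit with the quoted inclination and treating the Earth-rotation term as the equatorial value above, roughly scaled with latitude (so it is only an approximation away from the equator):
import numpy as np

def heading_azimuth(lat_deg, inclination_deg=98.54, ascending=True, earth_rot_deg=3.93):
    """Approximate ground-track heading (degrees clockwise from north)."""
    i = np.deg2rad(inclination_deg)
    lat = np.deg2rad(lat_deg)
    # spherical trigonometry: sin(heading) = cos(i) / cos(lat)
    psi = np.degrees(np.arcsin(np.clip(np.cos(i) / np.cos(lat), -1.0, 1.0)))
    # Earth-rotation term: equatorial value, roughly scaled by cos(lat);
    # it always pushes the ground track westward.
    corr = earth_rot_deg * np.cos(lat)
    heading = (psi - corr) if ascending else (180.0 - psi + corr)
    return heading % 360.0

print(heading_azimuth(0.0))                    # ~347.5 deg, matching the value above
print(heading_azimuth(0.0, ascending=False))   # ~192.5 deg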
  • asked a question related to SAR Interferometry
Question
6 answers
I am working on SAR interferometry with some TSX Stripmap mode images. I have some choices for "Processor Gain Attenuation", which are 0, 10 and 20 dB, but I have no idea what this value means.
Could you give me some suggestions?
Also, could you tell me which software is best for TSX interferometry processing? Thank you!
Xuan
  • asked a question related to SAR Interferometry
Question
6 answers
After forming an interferogram, correcting its phase and applying a DEM to remove the topographic contribution, the result can be called "DInSAR", which is used for crustal displacement.
How can I create a displacement map from this DInSAR result?
Relevant answer
Answer
Hi Reda. Once you have unwrapped slant-range phase values, they need to be converted into slant-range metric units using displacement = lambda*phi/(4*pi), where phi is the slant-range displacement phase derived from phase unwrapping, pi is 3.14, and lambda is the wavelength of the SAR signal: for X-band it's 3.1 cm, for C-band 5.6 cm, and so on. If you apply this formula, you will get the slant-range motion in metric units.
If you want to convert the slant-range motion into absolute up/down motion (as you are looking at crustal uplift/subsidence), then you should consider the local incidence angle of the scene. The local incidence angle consists of the global incidence angle of the satellite while acquiring your scene and the slope of the terrain. Be cautious.
You ultimately need to tie your final product to some stable points where you consider there is no motion, or to points where you know some uplift/subsidence information from GPS or other field survey methods. This is necessary for accuracy assessment.
This in total will give you the product.
Regards.
Saurabh
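Saurabh's recipe in a few lines of numpy, as a hedged sketch; the sign convention (whether positive phase means motion towards or away from the sensor) depends on the processor, so check it against a known point, and the file name, angle and index values below are placeholders:
import numpy as np

def los_displacement(unwrapped_phase, wavelength):
    """Slant-range (LOS) displacement from unwrapped phase: d = lambda*phi/(4*pi)."""
    return wavelength * unwrapped_phase / (4.0 * np.pi)

def vertical_displacement(d_los, incidence_deg):
    """Project LOS motion to vertical, assuming purely vertical ground motion."""
    return d_los / np.cos(np.deg2rad(incidence_deg))

phi = np.load("unwrapped_phase.npy")               # placeholder input
d_los = los_displacement(phi, wavelength=0.056)    # C-band, ~5.6 cm
d_up = vertical_displacement(d_los, incidence_deg=34.0)
# Finally reference to a stable point / GPS, e.g. subtract the value there:
d_up -= d_up[100, 200]                             # placeholder stable-pixel indices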
  • asked a question related to SAR Interferometry
Question
2 answers
I need to generate a DInSAR to study the surface displacement
In NEST, it is required to first create two pairs:
1. TOPO pair 
2. DEFO pair
The information in the user manual is not enough.
I need help with these steps in the software.
  • asked a question related to SAR Interferometry
Question
3 answers
I am currently working on an HPC-Cluster with 16 available threads. So I put the command export OMP_NUM_THREADS=16 at the top of my bash script, hoping that the number of threads is set for all following GAMMA-functions that use OPENMP. Unfortunately this is not working (e.g. with running offset_pwr), as still the default number of 4 threads is used. Does anybody have any suggestions?
Relevant answer
Answer
Dear Falah,
Thank you for your quick answer. Multithreading is working now. After contacting GAMMA, we learned that multithreading with more than 4 threads is no longer available for algorithms where a higher number of threads gives no significant performance improvement. As offset_pwr is such an algorithm, and we tested multithreading mostly with this function, we (wrongly) assumed that multithreading was not working in general in our case.
  • asked a question related to SAR Interferometry
Question
3 answers
How to go about the topic as I am very new to this topic.
Relevant answer
Answer
Dear Chandra,
there are several methods that have been developed for specific SAR systems (interferometric SAR in particular), but 10 years ago we showed that a "standard SAR system" (such as the ones on ERS, Radarsat & Envisat) could be used via the Doppler centroid (Chapron et al. JGR 2005; this was already proposed by van der Kooij for ERS data a few years earlier). This is what is also used in the Kang et al. document just sent by A.R. Karbassi. This Doppler-centroid velocity will now be routinely computed for the Sentinel-1A/B satellites. There are issues in correcting for wind and wave effects there (as well as in InSAR data), and Alexis Mouche has been working on these specific issues.
Other than that, you may use the refraction of swells over currents to invert current shears (see papers by Grodsky et al:
or the variation in ocean roughness and its relation to currents (see Kudryavtsev et al. JGR 2005, 2012)
However, the most "popular" method (as defined by invested $$) is to use a cross-track interferometric SAR to measure sea surface height and try to invert from that the currents through a quasi-geostrophic approximation... this is the basis of planned satellite missions SWOT and COMPIRA. You can look up the JPL web site for details of the method. It is probably not the most efficient but SWOT is not just designed to measure currents, it is mainly meant to measure river and lake levels.
  • asked a question related to SAR Interferometry
Question
4 answers
Hey there. I'm having trouble processing some full-polarimetric SLC Radarsat-2 imagery.
In NEST DAT 4C, under "SAR Tools | Multilooking", I have two multilooking options: (1) GR Square Pixel and (2) Independent Looks.
My image has a pixel size of ~11 m in range and ~5 m in azimuth. When I select GR Square Pixel and configure 1 look in the range direction, NEST suggests 3 looks in azimuth; if I select 2 looks in range, NEST suggests 7 looks in azimuth.
I don't understand those multilooks; they are not square pixels. Is there any difference between "ground range" multilooking and "normal" multilooking?
Relevant answer
Answer
A quick introduction to SAR imaging terminology can be found here:
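As far as I understand the "GR Square Pixel" option, it first converts the slant-range spacing to ground range (dividing by the sine of the incidence angle) and then picks the number of azimuth looks that makes the multilooked ground pixel roughly square, whereas "Independent Looks" just applies whatever look numbers you type in. A small sketch of that arithmetic, with an assumed incidence angle (take the real one from the product metadata):
import numpy as np

def gr_square_looks(slant_rng_spacing, az_spacing, incidence_deg, range_looks=1):
    """Number of azimuth looks that makes the multilooked GROUND-range pixel
    roughly square. The incidence angle here is an assumed placeholder."""
    ground_rng = slant_rng_spacing / np.sin(np.deg2rad(incidence_deg))
    target = ground_rng * range_looks            # multilooked ground-range pixel size
    return max(1, int(round(target / az_spacing)))

print(gr_square_looks(11.0, 5.0, 45.0, range_looks=1))   # ~3 azimuth looks
print(gr_square_looks(11.0, 5.0, 45.0, range_looks=2))   # ~6-7 azimuth looks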