Synthetic Aperture Radar
Armin W. Doerry and Fred M. Dickey

Optics & Photonics News ■ November 2004
1047-6938/04/11/0028/6-$0015.00 © Optical Society of America

Optics and synthetic aperture radar (SAR). At first glance one might question what the two technologies have in common; in reality they share an intertwined history that dates from the earliest coherent radar imaging effort.[1]

In the early 1950s, researchers discovered that an airborne side-looking radar's antenna beam could be artificially narrowed to improve its angular resolution by use of the Doppler characteristics of the radar echoes. To
achieve this effect, the corresponding
antenna aperture was synthesized by
summing multiple returns to create a
much narrower beam than that of the
real antenna carried by the aircraft.
Significant technical challenges were
rapidly overcome, allowing practical
operational systems to be flown as early
as 1958. Since then, the pace of develop-
ment in the field has not slowed; subse-
quent work has generated ever more
sophisticated synthetic aperture radar
(SAR) systems that today offer incredi-
bly detailed images with all the atten-
dant advantages of a microwave radar
system, including the ability to image
at night and through clouds, fog, dust,
adverse weather and, in special circum-
stances, foliage and the ground itself.
SAR systems have been successfully operated from raised platforms, manned and unmanned aircraft of all types, spacecraft orbiting Earth and other planets, and even from Earth to image the moon and other planets. The nature of SAR images also facilitates a number of other useful products, such as high-fidelity topographic maps and sensitive change detection maps. SAR processing embodies, in a single technology, the principles of holography, tomography, optics and linear filtering. Engineers have successfully fielded systems that operate at meter to millimeter wavelengths. Systems that operate at optical wavelengths are now under development. Each type of system has its own advantages and disadvantages.

Although the concept of SAR is more than 50 years old, relatively recent technological advances in components and algorithms have allowed a leap in its utility and desirability as a remote sensing instrument, so that today SAR often rivals electro-optical/infrared (EO/IR) systems.

Image of Washington, D.C., created by a Sandia National Laboratories radar system.

The fundamentals of SAR

A SAR image such as that illustrated in Fig. 1 is usually a two-dimensional (2D) map of the radar reflectivity of a target scene which includes dimensions of range and azimuth. Most airborne and orbital SAR systems are monostatic, in that they employ a single antenna for transmission and reception of the radar signal. The transmitted signal is typically a sequence of modulated pulses generated at various positions along the radar's flight path. Ranging is accomplished in the usual manner for radar, via pulse echo timing. SAR is unique in that echo data from the different positions, also sequential in time, are processed as a collection to artificially lengthen the antenna array to the spatial extent of the collection, or in other words, to the synthetic aperture length. The technique narrows the array beam pattern and makes it possible to achieve finer azimuth resolution. This type of coherent processing across multiple pulses is often called Doppler processing.

In SAR the essential measurement is a record of the pulse echoes at various positions along the flight path, where specific echo time delays correspond to round-trip ranges via the propagation velocity. Recognition of the fact that the same delay is achieved for a one-way range at half the propagation velocity (something the seismic imaging community terms the "exploding reflector" model) allows a meaningful illustration of aperture synthesis such as that shown in Fig. 2.
In optics, an imaging lens applies a
phase function to a scattered field so that
coherent summation occurs at the cor-
rect location in the image plane. If, how-
ever, the field itself can be sampled with
both magnitude and phase, then the
focusing operation of the lens can be
applied with signal processing rather
than by the dielectric properties of the
lens. As can be seen in Fig. 2, any arc of
samples across the aperture would suf-
fice, with no restrictions on the linearity
or curvature of the arc. If the target scene
is static, then clearly the field measure-
ments need not be simultaneous, or
even collected in any particular order.
Sampling, in which the pulse echo data
is collected during the course of the air-
craft’s flight, is inherent to a pulsed radar
system. The Doppler signatures of objects
in the target scene are manifested in their
pulse-to-pulse phase variations. In an
analogy with the field of lens design, it is
however essential that the spatial loca-
tions of the samples—or at least their
precise positions relative to each other—
be known to a fraction of a wavelength.
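The idea of applying the lens's phase function in software can be made concrete with a toy calculation. The Python sketch below is purely illustrative: the wavelength, geometry and sample counts are invented, not those of any fielded system. It records the echo phase of a single point scatterer from positions along a straight flight path, then focuses by multiplying each sample by the conjugate of the expected phase and summing coherently:

```python
import numpy as np

wavelength = 0.03                    # 3 cm (X band); illustrative value
k = 2 * np.pi / wavelength

# A point scatterer and a straight synthetic aperture along x
target = np.array([0.0, 1000.0])     # (x, y) position in meters
xs = np.linspace(-50, 50, 201)       # antenna positions along the flight path

# Two-way phase history of the pulse echoes (magnitudes normalized to 1)
ranges = np.hypot(xs - target[0], target[1])
echoes = np.exp(-1j * 2 * k * ranges)

def focus(x_test, y_test):
    """Coherent summation: apply the conjugate of the expected phase history
    (the 'lens' realized in signal processing) and sum across the aperture."""
    r = np.hypot(xs - x_test, y_test)
    return abs(np.sum(echoes * np.exp(1j * 2 * k * r))) / len(xs)

x_grid = np.linspace(-5, 5, 401)
response = np.array([focus(x, target[1]) for x in x_grid])

peak = x_grid[np.argmax(response)]
print(f"focused peak at x = {peak:.3f} m")   # the true target azimuth is x = 0
```

The response peaks only where the assumed and actual phase histories match, which is exactly the narrowed-beam behavior described above.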
Modern inertial measurement sys-
tems, especially when aided by Global
Positioning System (GPS) information,
can often measure relative radar location
to within centimeters per second of syn-
thetic aperture collection time, with
submillimeter random position error.
Excessive motion measurement errors,
which result in smeared or blurred
images, can often be remediated by
means of autofocus signal processing
techniques. A popular SAR autofocus
algorithm with roots in astronomy is
known as the phase-gradient autofocus algorithm.[2] In any case, whether autofocus is used or not, if the residual phase
errors in the compensated data set are
less than a fraction of a wavelength,
then the image will exhibit resolution
approaching the desired diffraction limit
of the synthetic aperture. Good radar
designs often achieve what is, in effect,
diffraction-limited imaging.
Strictly speaking, SAR entails synthe-
sizing a longer antenna aperture to the
end of achieving finer azimuth angular
resolution. The azimuth resolution is
limited only by the length of the synthetic
aperture, not by the size of the antenna
carried by the aircraft. However, a con-
straint on the real (physical) antenna
remains: to be capable of keeping the
scene of interest within the antenna
beam footprint. Appropriate synthetic
aperture lengths, which are commonly
from several meters to tens of kilometers,
are calculated from range, resolution
and wavelength.
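That calculation is simple enough to write down. For a broadside collection, a common approximation gives azimuth resolution ρ_az ≈ λR/(2L) for synthetic aperture length L, so L ≈ λR/(2ρ_az). A one-function sketch, with numbers invented for illustration rather than drawn from any particular system:

```python
def synthetic_aperture_length(wavelength_m, range_m, azimuth_res_m):
    """Approximate broadside aperture length: L = lambda * R / (2 * rho_az)."""
    return wavelength_m * range_m / (2.0 * azimuth_res_m)

# Example: ~1.8 cm wavelength (Ku band), 25 km standoff, 10 cm azimuth resolution
L = synthetic_aperture_length(0.018, 25e3, 0.10)
print(f"required synthetic aperture: {L:.0f} m")   # → 2250 m
```

Note how a kilometers-long aperture arises naturally from a fine resolution at a long standoff range, consistent with the range of lengths quoted above.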
SAR images are more appealing in
aesthetic terms when the range resolution
is commensurate with that of the finer
azimuth resolution. Finer range resolu-
tion is achieved by sending a pulse of
adequate bandwidth; this can be done
either by sending a suitably short pulse
or by modulating a pulse so as to yield a
narrow autocorrelation function similar
to that which characterizes spread-
spectrum communications. Popular
modulation schemes include random
phase codes and the linear-frequency-
modulated (LFM) chirp signal. Modern
SAR systems typically employ pulses that
range from several microseconds to sev-
eral hundred microseconds in length,
with time-bandwidth products that are
sometimes in the tens of thousands. The
LFM chirp signal is particularly advanta-
geous for fine resolution SAR systems in
that it can be easily generated; another
advantage is that it can be partially pro-
cessed before the data is digitized. Prior
to sampling, the chirp can effectively be
removed from the echo signals via het-
erodyning. The resulting video signal has
reduced bandwidth, in which a constant
frequency maps to a constant relative
delay (range).
The collected data set represents a sec-
tion of a surface in the Fourier space of
the target scene being imaged, as illus-
trated in Fig. 3. Since the raw SAR data
consist of samples of the Fourier space
of the target scene, it is only natural to
employ Fourier transform techniques to
process them into an image. Because the data are collected on a polar raster, they often have to be reformatted or resampled before digital signal processing can take place efficiently. A popular technique for processing raw SAR data is the polar format algorithm.

Figure 1. SAR image of a location at Kirtland Air Force Base, Albuquerque, N.M., exhibiting 4-inch (10-centimeter) resolution. Note that the aircraft are better defined by their shadows than by their direct echo return.
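The Fourier-space picture can be illustrated directly. The sketch below is an idealized toy, not the polar format algorithm itself: it skips the polar-to-rectangular resampling step and places the samples on a rectangular grid from the start, then forms an image of two point scatterers with a single inverse 2D FFT:

```python
import numpy as np

N = 128
kx = np.fft.fftfreq(N)[None, :]          # normalized spatial frequencies
ky = np.fft.fftfreq(N)[:, None]

# Ideal Fourier-space samples of two unit point scatterers
targets = [(32, 40), (80, 90)]           # (row, col) pixel positions
data = sum(np.exp(-2j * np.pi * (ky * r + kx * c)) for r, c in targets)

# With rectangular sampling, image formation is just an inverse 2D FFT
image = np.abs(np.fft.ifft2(data))

# The brightest pixels land at the scatterer positions
peaks = np.argsort(image.ravel())[-2:]
print(sorted(divmod(int(p), N) for p in peaks))   # → [(32, 40), (80, 90)]
```

In a real polar-format processor the interpolation from the polar raster to this rectangular grid is the computationally delicate step; the transform itself is routine.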
“Spotlighting” and “strip-mapping”
are the two principal operating modes for
SAR systems. In spotlight SAR, the radar
dwells on a single scene for one or more
synthetic apertures, with the image width
confined to the antenna azimuth beam
footprint. Originally, the term strip-map
SAR was used to describe cases in which
the radar was used to scan during a syn-
thetic aperture, forming an arbitrarily
long image from multiple overlapping
synthetic apertures along a flight path
that was generally much longer than the
azimuth footprint of the antenna beam.
Modern SAR systems often form strip
maps by mosaicking a sequence of spot-
light images.
A number of subtleties in the charac-
teristics of the data have been ignored in
this discussion; some of them become
problematic as resolutions become finer
or scene sizes increase. Various processing
algorithms have been developed to
accommodate these characteristics and
mitigate their effects; each has its propo-
nents for different applications. All share
the objective of creating an image from
field measurements along a synthetic
aperture over a finite bandwidth.
SAR optical processing
From the very early days, synthetic aper-
ture radar imaging and optics have been
closely linked.[3-5] Fourier optics provided
the necessary signal processing technol-
ogy for what might be considered the
first successful SAR systems. The very
successful application of optical signal
processing to synthetic aperture radar
development was also a big stimulus for
interest in, and development of, Fourier
optics and optical signal processing. The
equivalence between SAR image forma-
tion and holography also played an
important role in the development of the
technology. Harger[3] states in his book,
“The elegant ideas of [Dennis] Gabor
and [Emmett] Leith are not essential but
[are] very instructive in understanding
and generalizing the synthetic aperture
radar principle.” It would be very difficult
to treat in significant detail the fruitful
interaction between the radar and optics
community in these few pages, but by the
same token it would be difficult to over-
state the role that optics played in the
development of SAR technology.
Early research culminated in 1953 in
a much larger effort known as Project
WOLVERINE,[6] coordinated by the
University of Michigan. SAR technology
developed rapidly from this point. Early
in development, it was recognized that
there was a data storage problem: the
electronics of the day did not offer a
practicable storage method for the data
rates generated by SAR systems. A clever
solution was to use photographic film as
a storage medium. High-resolution pho-
tographic film offered data storage densi-
ties on the order of 1,000 line pairs per
millimeter. Film recording was used
successfully in airborne SAR systems as
early as 1957.[6]
The electronics of the day also did not
measure up to the data processing prob-
lem; computers at that time were not up
to the task. Although there were attempts
at electronic analog data processing,
Fourier optical processing of the film-
recorded data was recognized almost
immediately as a very viable solution
to the 2D signal processing problem.
Figure 4 illustrates the components of
a successful optical processing system.
A key component, the conical lens, is
another of the many contributions of
Leith.[3]
In this system, each range line of
data is written across the film as a data
line and high-resolution SAR images are
obtained as the output recording film
tracks the motion of the input film
through the system. This system was
eventually replaced by the more flexible tilted-plane optical processor, which consisted of a cylindrical telescope, a spherical telescope and tilted input and output planes. The tilt of the planes could be changed to accommodate changes in SAR system parameters.

Figure 2. SAR processing samples the scattered field and applies the imaging functions depicted to the right of the aircraft flight path.

Figure 3. SAR data represent a surface in the 3D Fourier space of the target scene.
Eventually, advances in electronics
and computing technology made optical
processing of SAR data obsolete. But the
epoch of optical processing was not as
short as people relatively new to the
fields of optics or radar might think.
Optical processing was the major
method of producing high-resolution
images from the late 1950s (before the advent of the laser[7]) until the 1980s. In
his 1980 paper, Dale Ausherman[5] states,
“While current operational SAR systems
almost unanimously employ coherent
optical methods, proponents of new sys-
tems stress the need for digital technolo-
gies in order to overcome the apparent
limitations of optical approaches.” It
can be argued that it was the success of
coherent optical processing that fueled
the relatively large research efforts that
followed in areas of optical data process-
ing and optical pattern recognition.
There are other ties between the optics
and SAR communities. One example is
the aforementioned autofocus algorithm,
which evolved from techniques used in
optical astronomy. Currently there is
interest in developing SAR type systems
at optical wavelengths.
Modern systems and
applications
As digital computers became more pow-
erful, optical processing techniques were
supplanted by digital signal processing
techniques which offered greater flexibil-
ity and more processing options. Later,
when computer hardware shrank in size
and weight, image formation processing
left the laboratories and became an inte-
gral part of the radar, a move which
offered the user real-time images as well
as multiple and flexible operating modes.
Today, a variety of systems fill impor-
tant roles in surveillance, reconnaissance,
mapping, monitoring and resource man-
agement. SAR systems are used by the
military, government agencies and com-
mercial entities.
One operational high-performance
airborne system is the Lynx (AN/APY-8)
SAR, designed by Sandia National
Laboratories and produced by General
Atomics. It offers a variety of imaging
modes and is capable of forming high-
quality real-time 4-inch resolution
images at 25 kilometer range (represent-
ing better than arc-second resolution)
in clouds, smoke and rain. This system,
which weighs 120 pounds, has flown
on a variety of unmanned and manned
aircraft, including helicopters. Sandia
National Laboratories is now engaged
in developing a MiniSAR system which
offers the same flexibility and high-
quality images and resolutions at a
somewhat reduced range, but weighs
less than 20 pounds.
In 1978, Seasat became the first Earth-orbiting SAR launched by NASA. Since
then, a plethora of orbital SARs have
been (and continue to be) flown by sev-
eral nations. One of them is the Shuttle
Imaging Radar flown by NASA. In 1990,
the Magellan SAR began orbiting and
mapping the cloud-enveloped planet of
Venus, offering observers unprecedented
views of its shrouded surface.
Normally in imaging, only the magnitude of the image pixels is displayed. However, SAR images formed
by digital computers retain their phase
information. This factor can be exploited
to display a number of interesting scene
characteristics.
Interferometric SAR (IFSAR or
InSAR) is a technique in which images
are formed from two synthetic apertures
taken at slightly different radar antenna
elevation angles. The images exhibit a
very subtle parallax which is observable
as a phase difference that depends on tar-
get height. The phase difference is mea-
sured on a pixel by pixel basis to ascertain
the surface topography of the scene; this
procedure allows 3D surface maps with
unprecedented detail and accuracy to
be composed. Sandia’s Rapid Terrain
Visualization project demonstrated an
IFSAR with on-board processing capable
of forming topographic maps with
3-meter post spacing and 0.8-meter
height accuracy. Figure 5 shows a typical
product of this system, in which color
maps height.
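The phase-to-height conversion can be sketched with the usual flat-Earth approximation, under which h ≈ λR sinθ · Δφ/(4πB⊥) for a two-way (repeat-pass) interferometer with perpendicular baseline B⊥, slant range R, look angle θ and wavelength λ. The numbers below are invented for illustration, and the sketch ignores phase unwrapping and terrain-slope effects:

```python
import numpy as np

wavelength = 0.03                  # m, illustrative
slant_range = 10e3                 # m
look_angle = np.deg2rad(45.0)
baseline_perp = 1.0                # m, perpendicular baseline

def height_from_phase(dphi):
    """Flat-Earth, two-way convention: h = lam * R * sin(theta) * dphi / (4*pi*B_perp)."""
    return (wavelength * slant_range * np.sin(look_angle) * dphi
            / (4 * np.pi * baseline_perp))

# A toy unwrapped interferometric phase ramp, i.e., a gentle slope
dphi = np.linspace(0.0, 0.2, 5)    # radians
print(np.round(height_from_phase(dphi), 3))
```

Applied pixel by pixel to an unwrapped phase-difference map, this is the core of the topographic products described above.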
If flight geometries are sufficiently dif-
ferent, then the parallax results in a mea-
surable displacement of image features.
Stereoscopic measurements can then be
made to also measure topography with
great accuracy.
Since SAR is essentially a narrowband
imaging system, SAR images exhibit
speckle in regions of diffuse scattering,
such as vegetation fields, gravel beds and
dirt roads. For a static target scene, if
a synthetic aperture is repeated (same
flight path and viewing angle), then the
speckle patterns will be identical in both
magnitude and phase: the speckle, in
other words, is coherent. If a region of
the scene is disturbed between the times
in which the two images are captured, then the speckle coherence for that region is destroyed. Pixel-by-pixel coherence measurement and mapping for the two images will display the destroyed coherence and distinguish it from its surroundings. This technique is called coherent change detection. It can be used to map vehicle tracks on a dirt road, footprints in a grassy field and other subtle changes otherwise indistinguishable in the individual SAR images or by means of any other sensor. An example in which footprints and mower tracks in grass are revealed is shown in Fig. 6.

Figure 4. Components of an optical SAR processing system: a collimated beam illuminates the film-recorded data from the left; the conical and cylindrical lenses compensate for the tilt and separation of the range and azimuth image planes. The focused image is recorded at the plane on the right.
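A standard way to quantify speckle coherence between two co-registered complex images is the windowed sample coherence γ = |Σ a b*| / √(Σ|a|² Σ|b|²). The sketch below uses synthetic circular-Gaussian speckle, not real radar data: γ stays near 1 where the clutter is unchanged and drops where the speckle has been replaced by an independent realization, mimicking a disturbed patch of ground:

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle(shape):
    """Fully developed speckle: circular-Gaussian complex clutter."""
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

a = speckle((64, 64))
b = a.copy()
b[24:40, 24:40] = speckle((16, 16))   # "disturbed" patch: new, independent speckle

def coherence(a, b, w=8):
    """Sample coherence |sum(a * conj(b))| / sqrt(sum|a|^2 * sum|b|^2), per w x w window."""
    out = np.zeros((a.shape[0] // w, a.shape[1] // w))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            pa = a[i*w:(i+1)*w, j*w:(j+1)*w]
            pb = b[i*w:(i+1)*w, j*w:(j+1)*w]
            num = abs(np.sum(pa * np.conj(pb)))
            out[i, j] = num / np.sqrt(np.sum(abs(pa)**2) * np.sum(abs(pb)**2))
    return out

gamma = coherence(a, b)
print(np.round(gamma, 2))   # near 1.0 everywhere except the disturbed block of windows
```

Thresholding such a coherence map is essentially what a coherent change detection product like Fig. 6 displays.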
Areas of current research and devel-
opment include foliage penetration,
ground penetration, imaging moving
vehicles, bistatic imaging (transmitting and receiving antennas on separate vehicles) and techniques for improved image
quality, particularly at long ranges, fine
resolution and for large scenes.
SAR was once termed “the sensor of
last resort.” Today’s modern high-perfor-
mance SAR systems, with their multiple
modes and unique capabilities, are
increasingly being turned to as indis-
pensable imaging tools.
Armin W. Doerry and Fred M. Dickey are both
Distinguished Members of Technical Staff at Sandia
National Laboratories, Albuquerque, N.M. They
invite the reader to visit many more examples of
SAR images, image products, programs and applica-
tions at www.sandia.gov/radar. Radar
questions can also be directed to Armin
Doerry at awdoerr@sandia.gov.
References
1. J. C. Curlander and R. N. McDonough, Synthetic Aperture Radar: Systems and Signal Processing, John Wiley & Sons, New York, 1991.
2. C. V. Jakowatz Jr. et al., Spotlight-Mode Synthetic Aperture Radar: A Signal Processing Approach, Kluwer Academic Publishers, Boston, 1996.
3. R. O. Harger, Synthetic Aperture Radar Systems: Theory and Design, Academic Press, New York, 1970.
4. E. N. Leith, "Quasi-Holographic Techniques in the Microwave Region," Proc. IEEE 59(9), 1305-18, 1971.
5. D. A. Ausherman, Opt. Eng. 19(2), 157-67, 1980.
6. L. J. Cutrona et al., "A High-Resolution Radar Combat-Surveillance System," IRE Trans. Mil. Electron. MIL-5, 127-31, 1961.
7. K. Preston Jr., Coherent Optical Computers, McGraw-Hill, New York, 1972.
Figure 5. Three-dimensional rendering of IFSAR data of Albuquerque International Airport. Color maps height.
Figure 6. Coherent change detection map showing mower activity and footprints on Hardin
Field Parade Ground at Kirtland Air Force Base. Dark areas denote regions of decorrelation
caused by a disturbance to the clutter field; light areas denote no disturbance. The foliage
along the right side of the image decorrelates because of wind disturbance.