A First Order Analysis of Lighting, Shading, and Shadows
Ravi Ramamoorthi, Dhruv Mahajan and Peter Belhumeur
The appearance of a scene depends on many factors: how the lighting varies spatially across a surface, how it varies
along different directions, the geometric curvature and reflectance
properties of objects, and the locations of soft shadows. In this
paper, we conduct a complete first order or gradient analysis of
lighting, shading and shadows, showing how each factor separately
contributes to scene appearance, and when it is important. Gradi-
ents are well suited for analyzing the intricate combination of ap-
pearance effects, since each gradient term corresponds directly to
variation in a specific factor. First, we show how the spatial and
directional gradients of the light field change, as light interacts with
curved objects. This extends the recent frequency analysis of Du-
rand et al. to gradients, and has many advantages for operations,
like bump-mapping, that are difficult to analyze in the Fourier do-
main. Second, we consider the individual terms responsible for
shading gradients, such as lighting variation, convolution with the
surface BRDF, and the object’s curvature. This analysis indicates
the relative importance of various terms, and shows precisely how
they combine in shading. As one practical application, our theoret-
ical framework can be used to adaptively sample images in high-
gradient regions for efficient rendering. Third, we understand the
effects of soft shadows, computing accurate visibility gradients. We
generalize previous work to arbitrary curved occluders, and develop
a local framework that is easy to integrate with conventional ray-
tracing methods. Our visibility gradients can be directly used in
practical gradient interpolation methods for efficient rendering.
1 Introduction

A theoretical analysis of lighting and shading has many applica-
tions in forward and inverse rendering. For example, understanding
where the image intensity varies rapidly can be used to determine
non-uniform image sampling rates for efficient rendering. Under-
standing how shading changes in penumbra regions can lead to ef-
ficient and robust soft shadow computations, as well as advances in
inverse lighting-from-shadow algorithms. In this paper, we seek to
address these theoretical questions through a first order or gradient
analysis of lighting, shading and visibility.
The appearance of a surface, and its gradients, depends on many
factors. The shading is affected by lighting—the spatial lighting
variation over a flat object surface due to close sources, as well as
the angular variation in lighting at a point from different directions.
Shading also depends on geometric effects like the object’s curva-
ture, which determines how much the surface normal or orientation
changes between neighboring points. The material properties are
also important, since shading is effectively a convolution with the
object BRDF [Ramamoorthi and Hanrahan 2001]. These factors
can combine in complex ways in an image, and each factor may
have less or more importance depending on the situation. For ex-
ample, the spatial variation in lighting over a surface can be pri-
marily responsible for the specular reflections from a near source
on a glossy flat table. On the other hand, the angular variation in
lighting is most important for a highly curved bumpy object on the
table—the effect of spatial variation here is often small enough that
the lighting can effectively be treated as distant (see Fig. 5).
By definition, the gradient is usually a sum of terms, each cor-
responding to variation in a specific factor. Hence, a first order
analysis is able to isolate the impact of various shading effects.
Our computation of gradients also enables new practical rendering
algorithms, such as efficient gradient-based image sampling, and
fast and accurate gradient-based interpolation of visibility (Fig. 1).
Specifically, we make the following contributions:
Analysis of Light Reflection: First, we analyze the basic conceptual steps in the reflection of light from a curved surface (Sec. 4.1). We develop the theory for both spatial and angular (or directional) gradients of the light field, since many visual effects involve a rich interplay between spatial and angular information.

Figure 1: Our theoretical analysis can be applied to efficient rendering. Top: Gradient-based image sampling achieves a 6× speedup on a scene with bumpy, diffuse and specular objects, including shadows and near-field lighting. Bottom: We use visibility gradients for rendering accurate soft shadows from curved objects on the ground plane, evaluating visibility explicitly at only 1% of image pixels. More details are in Figs. 8 and 15.
Analysis of First Order Terms and Image Sampling: In Sec. 4.3, we determine the gradients for shading on a curved ob-
ject lit by general spatial and directionally-varying lighting. We
combine the basic shading steps in Sec. 4.1, augmenting them with
non-linear transformations like bump mapping (Sec. 4.2). Our final
gradient formula can be separated into individual terms that correspond to effects like spatial lighting variation, angular variation, and
surface curvature. We analyze the effects of these terms in a variety
of situations, understanding which factors are important for the ap-
pearance of different scenes (Sec. 4.5). Moreover, we show how to
extend the first order analysis to second-order Hessians (Sec. 4.7).
Section 5 (Figs. 1 and 8) applies these ideas to efficient render-
ing, by adaptively sampling images using a metric based on gra-
dient magnitude. We consider general scenes, with bump maps,
glossy reflectance, shadows, and near-field lighting, achieving ac-
curate results using only 10%−20% of the effective pixels.
Analysis of Visibility Gradients: We derive new analytic expressions for soft shadow gradients in Secs. 6 and 7. These have usually been neglected in practical gradient techniques [Ward and Heckbert 1992; Annen et al. 2004]. Our analysis is general, and works for arbitrary curved blockers, as well as polygonal objects. Moreover, our formulation is local, based only on analyzing angular discontinuities in visibility at a single spatial location. We demonstrate practical applications to efficient and accurate rendering of soft shadows using gradient-based interpolation (Figs. 1 and 15).
2 Previous Work

This paper builds on a substantial body of previous work on ana-
lyzing light transport in a number of different representations.
Operation                 Spatial and Angular Domain        Gradient                     Fourier                         Wavelet
Multiplication            h(x,θ) = f(x,θ)g(x,θ)             ∇h = f∇g + g∇f               H(Ω) = F(Ω) ⊗ G(Ω)              No simple formula
Integration               h(x) = ∫ f(x,θ) dθ                hx = ∫ fx(x,θ) dθ            H(Ωx) = F(Ωx, 0)                Haar only, Hi = Fi0
Convolution               h(x,θ) = ∫ f(x,ω)g(θ−ω) dω        ∇h = ∇f ⊗ g                  H(Ω) = F(Ω)G(Ω)                 No simple formula
Linear transformation     h(u) = f(Mu)                      ∇h(u) = MT∇f(Mu)             H(Ω) = |det(M)|^{-1}F(M^{-T}Ω)  No simple formula
Nonlinear transformation  h(u) = f(T(u))                    ∇h(u) = JT(u)∇f(T(u))        No simple formula               No simple formula

Figure 2: The basic mathematical operators of light transport, and the resulting transformations in gradient, Fourier and wavelet representations.
Frequency Domain Analysis: Frequency domain techniques have been popular for light field analysis, leading to a signal-processing approach. Chai et al. [2000] analyze light field sampling in the Fourier domain. Ramamoorthi and Hanrahan [2001] develop a convolution framework for reflection on curved surfaces using spherical harmonics. Ng [2005] has shown how Fourier analysis can be used to derive a slice theorem for light fields.
Most recently, and closest to our work, Durand et al. [2005] derive a frequency analysis of light transport considering both spatial and angular variation. In Sec. 3.1 (Fig. 2), we directly compare Fourier and gradient analysis in terms of basic mathematical operators (Sec. 8, at the end, has a more detailed discussion of specific steps). First order analysis has two main benefits for us. The gradient is naturally written as a sum of terms corresponding to specific variations in shading, while keeping other factors fixed. This makes it easy to isolate and analyze each shading effect separately. Moreover, first order analysis is by definition fully local and can handle general non-linear effects like bump mapping, while Fourier analysis always requires a finite neighborhood and linearization.
Wavelet Analysis: Wavelets have been another popular tool for efficient computations and representation of light transport. Early work in rendering includes wavelet radiosity [Gortler et al. 1993; Gershbein et al. 1994]. More recently, Ng et al. [2004] have analyzed multiplication and triple product integrals using wavelets. However, many of the mathematical operations of light transport currently have no simple analytic interpretation in wavelets (see Sec. 3.1 and Fig. 2). Thus, wavelets seem more useful for efficient practical computation, rather than for deriving theoretical insights.
Differential and Gradient Analysis: Gradient-based methods have been widely used in graphics, starting with the irradiance gradients of Ward and Heckbert [1992]. While we build on these methods, there are some important differences. Ward and Heckbert [1992] essentially try to find the gradients of the incident light field; we seek to determine how these gradients evolve as light interacts with object surfaces or blockers. Igehy [1999] and Chen and Arvo [2000] find differentials of individual ray paths as certain parameters (like viewpoint or location on the image plane) vary. By contrast, we seek to determine how the gradients over the entire light field transform. Most importantly, this paper is focused more on theoretical analysis, understanding the nature of shading variation by considering the various gradient terms. We are optimistic that our analysis can be used to derive new theoretical bounds and practical algorithms for previous methods.
Shadows are one of the most important visual features in a scene. Durand et al. [2002] develop a full characterization of visibility events in terms of the visibility complex. Soler and Sillion [1998] and Ramamoorthi et al. [2005] have characterized special cases as convolutions. Arvo [1994] has derived irradiance Jacobians for occluded polyhedral scenes, and applied them to shadow computations based on a global analysis of scene configuration. Holzschuch and Sillion [1998] compute gradients and Hessians of form factors for error analysis. By contrast, our approach is local, using only the visibility information at a single spatial location, and can consider general curved occluders in general complex lighting.
3 Preliminaries

We start by writing the reflection equation at a single point x,

B(x,θ) = ∫ L(x,ω) V(x,ω) ρ(θ,ω) dω,    (1)

where B is the reflected light field, L(x,ω) is the incident light field, ρ is the BRDF and V is the visibility. In this paper, light fields such as B or L are expressed in terms of their spatial location x and local angular direction (ω or θ), with respect to the local surface normal.
Our goal is a first order analysis of reflection on a curved surface.
We consider both spatial and angular gradients, because most phys-
ical phenomena involve deep interplay between spatial and angular
effects. For example, angular variation in the lighting often leads to
spatial variation in the shading on a curved object.
For much of the paper, the derivations are carried out in the 2D
plane or flatland for clarity and simplicity. While the 3D extensions
(detailed in Secs. 4.4, 7 and Appendix B) are more complicated
algebraically, much the same results are obtained. Our analysis is
applied practically to efficient rendering of 3D scenes (Sec. 5), and
to evaluation of soft shadows from curved blockers in 3D (Sec. 7).
We will be analyzing various parts and generalizations of Equa-
tion 1. In this section, we will consider abstractly the result h of
the interaction of two functions f and g, which will usually corre-
spond to the lighting and BRDF respectively. From Sec. 4 onwards,
we will be more concrete, using notation closer to Equation 1. The
partial derivatives will be denoted with subscripts—for example,
fx(x,ω) = ∂ f(x,ω)/∂x. In Sec. 3.1, we will also compare the first
order analysis to Fourier analysis, such as [Durand et al. 2005], pre-
senting a unified framework for both in terms of basic mathemat-
ical operations. Sec. 8 at the end of the paper has a more specific
discussion and comparison with examples. We denote the Fourier
transform of f(x,θ) as F(Ωx,Ωθ), where the subscripts now stand
for the spatial (x) or angular (θ) coordinate.
3.1 Mathematical Operations of Light Transport

The interaction of lighting with the reflectance and geometry of ob-
jects involves fairly complex effects on the light field, as well as
the gradients or Fourier spectra. However, the basic shading steps
can all be reduced to five basic mathematical building blocks—
multiplication, integration, convolution of functions, and linear and
nonlinear transformations on a function’s domain. For example,
modulation of the shading by a texture map involves multiplica-
tion. Adding up the contributions of lighting from every incident
direction involves integration. The interaction of lighting and re-
flectance can usually be written as a convolution with the surface
BRDF. We will see that transformations between a global coordi-
nate frame and the local frame of the surface can be written as linear
transformations of the spatial and angular coordinates. Complex
shading effects like general bump mapping, and visibility computa-
tions require nonlinear transformations of the coordinates.
Figure 2 summarizes these mathematical operators for gradient,
Fourier and wavelet representations. While many of these formulae
are widely available in calculus textbooks, their forms give consid-
erable insight in comparing analysis with different representations.
Multiplication: Canonically, h(x,θ) = f(x,θ)g(x,θ). In the Fourier basis, this is a convolution, H(Ω) = F(Ω) ⊗ G(Ω), where the ⊗ symbol stands for convolution. For gradients,

∇h = f∇g + g∇f.    (2)

Integration: Consider h(x) = ∫ f(x,θ) dθ, where for example f may denote the lighting pre-multiplied by the cosine term (with the result h(x) being the diffuse shading). After a Fourier transform, this corresponds to restricting ourselves to the Ωθ = 0 line, i.e. the Ωx axis, so H(Ωx) = F(Ωx,0). For first order analysis, differentiation and integration commute, so

hx(x) = ∫ fx(x,θ) dθ.    (3)

Convolution: Canonically, h(x,θ) = ∫ f(x,ω)g(θ−ω) dω, where f can be thought of as the incident lighting and g as the homogeneous radially symmetric BRDF. In the Fourier basis, this becomes a multiplication, H(Ω) = F(Ω)G(Ω). For gradient analysis, it is convenient to realize that convolution is a symmetric operation.1 Thus, derivatives and convolutions commute, so that

h = f ⊗ g ⇒ ∇h = ∇f ⊗ g,    (4)

where the convolution is only over the angular coordinate.
Linear Transformations: In general, we consider
h(u) = f(Mu),
where u is an n×1 vector and M is an n×n matrix. In 2D, the light
field has two dimensions, so n = 2 and u = (x,θ)T. For example, f
could be the incident light field in global coordinates, and h could
be the lighting in the local coordinate frame of a point, with M being
the appropriate transformation of u = (x,θ)T.
For Fourier analysis, we can use the general Fourier linear transformation theorem. While the derivation is straightforward, it does not appear to be commonly found in standard texts or well known in the field, so we briefly derive it in Appendix A,

H(Ω) = |det(M)|^{-1} F(M^{-T}Ω),    (6)

where det(M) is the determinant of M.
For gradients, we have a similar linear transformation theorem (also derived in Appendix A). In particular,

∇h(u) = MT∇f(Mu).    (7)
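This theorem is easy to spot-check numerically. A minimal sketch, using the shear matrix from the per-point rotation step and an arbitrary smooth f (both our own choices):

```python
import numpy as np

k = 0.7
M = np.array([[1.0, 0.0],
              [k,   1.0]])       # (x, theta) -> (x, theta + k x)

def f(u):                        # smooth test light field
    x, t = u
    return np.exp(-x**2) * np.cos(t)

def grad(fun, u, eps=1e-5):      # central-difference gradient
    g = np.zeros(2)
    for i in range(2):
        d = np.zeros(2); d[i] = eps
        g[i] = (fun(u + d) - fun(u - d)) / (2 * eps)
    return g

h = lambda u: f(M @ u)
u0 = np.array([0.3, -0.2])
err = np.max(np.abs(grad(h, u0) - M.T @ grad(f, M @ u0)))
```

The two sides agree to finite-difference precision.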
Nonlinear Transformations: Finally, we come to nonlinear transformations. These are seldom considered in analyses of light transport (and are not treated by Durand et al. [2005] at all), because it is not clear how to handle them with Fourier or wavelet methods. To apply gradient techniques, we effectively use the chain rule. We assume h(u) = f(T(u)), where T is a general non-linear and not necessarily invertible transformation. However, T can be locally linearized by computing the Jacobian, to obtain a local linear transformation matrix J(u) (that now depends on u),

∇h(u) = JT(u)∇f(T(u)).    (8)
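A corresponding check for the nonlinear case, with a hypothetical bump-map-like transform T(x,θ) = (x, θ + 0.3 sin 2x) and its Jacobian (all choices ours):

```python
import numpy as np

def T(u):                        # nonlinear transform (x, θ) -> (x, θ + n(x))
    x, t = u
    return np.array([x, t + 0.3 * np.sin(2.0 * x)])

def J(u):                        # its Jacobian, computed analytically
    x, _ = u
    return np.array([[1.0, 0.0],
                     [0.6 * np.cos(2.0 * x), 1.0]])

def f(u):                        # smooth test light field
    x, t = u
    return np.exp(-x**2) * np.cos(t)

def grad(fun, u, eps=1e-5):      # central-difference gradient
    g = np.zeros(2)
    for i in range(2):
        d = np.zeros(2); d[i] = eps
        g[i] = (fun(u + d) - fun(u - d)) / (2 * eps)
    return g

u0 = np.array([0.5, -0.1])
h = lambda u: f(T(u))
err = np.max(np.abs(grad(h, u0) - J(u0).T @ grad(f, T(u0))))
```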
Besides relating Fourier and gradient techniques,
direct application of these formulae simplifies many derivations
both in our paper and in previous work. For example, many deriva-
tions in [Durand et al. 2005] follow directly from the Fourier linear
transformation theorem. The Fourier slice result in [Ng 2005] can
be easily derived using a combination of the linear transformation
and integration relations. Figure 2 also indicates why certain repre-
sentations are more commonly used for mathematical analysis. The
Fourier basis handles the first four basic operations in a very sim-
ple way, making it possible to conduct a full analysis of linear light
transport, such as Durand et al. [2005]. Similarly, the simple form
of those operations with gradients makes them well suited to the
analysis in this paper. Moreover, gradients are often the only avail-
able tool when considering nonlinear transformations, for which
there is no simple Fourier equivalent. For wavelets, on the other
hand, most operations like convolution or linear transforms are very
difficult to study analytically (even though there are often efficient
computational methods, such as the recent triple product multipli-
cation algorithms [Ng et al. 2004; Clarberg et al. 2005]).
4 Light Reflection from Curved Surfaces
In this section, we first discuss the important conceptual steps for
reflection from a homogeneous curved object (with a brief digres-
sion to consider spatially-varying materials and analysis in 3D).
Then, we analyze non-linear transformations like normal or bump
maps, and derive the combined gradient including all effects. Fi-
nally, we analyze the effects of individual shading terms, and the
sampling of images. In this section, we do not explicitly consider
cast shadows, since visibility is analyzed in detail in Secs. 6 and 7.
1By symmetry, hθ = fθ ⊗ g in Equation 4 is the same as hθ = f ⊗ gθ. This symmetry no longer holds for 3D spherical convolution, where the lighting is a 2D spherical function, while the radially symmetric BRDF is 1D. In that case, we must use f ⊗ gθ (see Appendix B). However, Equation 4 is still accurate for flatland, and can be used even for 3D sampling.
Figure 3: The light field and its spatial and angular gradients, as a result of
the various curved surface shading steps in Sec. 4.1. Green denotes positive
values, and red denotes negative values.
4.1 Basic Shading Steps
To illustrate our ideas, we start with a spatially and directionally-
varying light source, showing how the light field and gradients
change with various shading steps. As shown in Fig. 3a, the source
intensity L(x,θ) varies as a Gaussian along both spatial (horizon-
tal) and angular (vertical) axes. Besides providing a simple didactic
example, one motivation is to consider spatially and directionally-
varying sources, that have rarely been studied.
We assume the global coordinate frame is aligned so the surface
normal at the origin x = 0 is pointing straight up (towards θ = 0).
The surface is parameterized by the arc-length distance x along it
(which is equivalent to the global x coordinate near x = 0 and used
interchangeably). We linearize the surface about x = 0, so that the
normal is given by kx, where k is the standard geometric curvature,
and we use positive signs for counter-clockwise directions.
Step 1—Per-Point Rotation into Local or Surface Frame: We must perform a rotation at each point to convert global coordinates
to local. Let L(x,θ) be the incident light field in the global frame.
As shown in previous work [Ramamoorthi and Hanrahan 2001; Du-
rand et al. 2005], the light field in the local or surface coordinate
frame is Ls(x,θ) = L(x,θ +n), where n is the surface normal. Not-
ing that n = kx, we write Ls(x,θ) = L(x,θ +kx). This is a linear
transformation of the variables x and θ, that mixes spatial and angu-
lar coordinates, shearing the light field along the angular dimension
as seen in Fig. 3b. If u = (x,θ)T, Ls(u) = L(Mu) with M being

M = [ 1  0
      k  1 ].

Using the linear transformation theorem in Equation 7, and pre-multiplying by MT as required, ∇Ls(u) = MT∇L(Mu). This can be written out explicitly as

Lsx(x,θ) = Lx(x,θ+kx) + k·Lθ(x,θ+kx)
Lsθ(x,θ) = Lθ(x,θ+kx),    (10)

which can be easily verified by differentiating Ls directly, and where we have made the arguments for evaluation explicit. As seen in Fig. 3b, the spatial and angular gradients are sheared in the angular dimension, with the curvature term k·Lθ evaluated at the sheared coordinates (x,θ+kx).
From the above equation, the angular gradients Lsθ have the same form as the original (only sheared), while spatial variation occurs in two ways—either the incident light field includes spatially varying components Lx, and/or the surface has curvature k (and there is angular lighting variation Lθ). For a distant environment map (so L is independent of x), there is no spatial variation (Lx = 0), and Lsx is only due to curvature. For a flat surface, there is no curvature (and in fact, Ls = L for this step), and spatial gradients only come from the original light field. We can also see how to relate the two components, which have comparable magnitude when |Lx| ∼ |kLθ|. This discussion also immediately shows the benefit of first order analysis, where individual gradient terms correspond directly to different types of shading variation.
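The two-term structure of the spatial gradient can be verified directly; in this sketch (with a made-up light field L and curvature k, our own choices), the sum Lx + k·Lθ, evaluated at the sheared coordinates, matches a direct derivative of Ls(x,θ) = L(x,θ+kx):

```python
import numpy as np

def L(x, t):                    # hypothetical incident light field
    return np.exp(-5.0 * t**2) * (1.0 + 0.3 * np.sin(3.0 * x))

eps = 1e-5
def Lx(x, t): return (L(x + eps, t) - L(x - eps, t)) / (2 * eps)
def Lt(x, t): return (L(x, t + eps) - L(x, t - eps)) / (2 * eps)

def Ls(x, t, k):                # light field in the surface frame
    return L(x, t + k * x)

def Ls_x(x, t, k):              # two-term spatial gradient formula
    return Lx(x, t + k * x) + k * Lt(x, t + k * x)

x0, t0, k = 0.2, 0.1, 1.5
direct = (Ls(x0 + eps, t0, k) - Ls(x0 - eps, t0, k)) / (2 * eps)
err = abs(Ls_x(x0, t0, k) - direct)
```

Setting k = 0, or making L independent of x, isolates the two special cases discussed above.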
We can now multiply by the cosine
term, with the standard multiplication formula for the gradients
(Equation 2). Since the cosine effect is relatively subtle and of-
ten rolled into Phong-like BRDFs, we will simply incorporate it in
the BRDF transport function for the combined analysis in Sec. 4.3.
Step 2—Mirror Reparameterization: For glossy materials, we reparameterize by the mirror direction, setting Lm(x,θ) = Ls(x,−θ). The light field and gradients in Fig. 3c are therefore reflected about the θ-axis. The angular gradient is also negated, or more formally,

Lmx(x,θ) = Lsx(x,−θ)        Lmθ(x,θ) = −Lsθ(x,−θ).    (13)
Step 3—BRDF Convolution: Reflection from the surface can be written as a convolution with a radially symmetric BRDF2 ρ,

Bs(x,θ) = Lm ⊗ ρ = ∫ Lm(x,ω) ρ(ω−θ) dω.    (14)

For gradients, we use the gradient convolution rule in Equation 4. Since gradients and convolutions commute, we effectively obtain gradients of the convolution by convolving the gradients,

∇Bs(x,θ) = ∇Lm ⊗ ρ = ∫ ∇Lm(x,ω) ρ(ω−θ) dω.    (16)

Figure 3d shows the results of convolving with a Gaussian for ρ. This is analogous to a Phong or an approximate Torrance-Sparrow BRDF. We would expect the convolution to lead to some blurring along the vertical, or angular direction, and this is in fact the case for both the light field, and the spatial and angular gradients.
Step 4—Inverse Per-Point Rotation into Global Frame: So far, we have worked in the local or surface coordinate frame (hence, the superscript s). To obtain the final result in the global frame, we should undo the original per-point rotation, writing, analogous to Equation 10,

B(x,θ) = Bs(x,θ−kx)
Bx(x,θ) = Bsx(x,θ−kx) − k·Bsθ(x,θ−kx)
Bθ(x,θ) = Bsθ(x,θ−kx).    (17)
2We use ρ(ω − θ) instead of ρ(θ − ω) for algebraic simplicity in
Sec. 4.3. Since the BRDF is symmetric, this does not matter, and is actually
more consistent with our sign conventions.
Spatially Varying Materials: As a brief aside, we consider a generalization of step 3 to spatially-varying materials.3 In this case,

Bs(x,θ) = ∫ Lm(x,ω) ρ(x,ω−θ) dω.    (19)

Note that the convolution is only over the angular coordinates, while Lm and ρ are multiplied over the spatial coordinates. The gradients are given by

Bsx = Lmx ⊗ ρ + Lm ⊗ ρx        Bsθ = Lmθ ⊗ ρ.

The only additional term is Lm ⊗ ρx in Bsx, which corresponds to the spatial gradient or texture in the BRDF.
An interesting special case is texture mapping, where ρ(x) simply multiplies the diffuse shading. In that case, we denote E as the irradiance ∫ Ls(x,ω) dω, so that Bs(x) = E(x)ρ(x) and

Bsx = Exρ + Eρx.    (20)

For smooth objects, the diffuse shading is low frequency [Ramamoorthi and Hanrahan 2001], so Ex is generally small and Bsx ∼ Eρx. (A similar result holds even in 3D, with ∇Bs ∼ E∇ρ. In 3D, the direction of the gradient ∇Bs depends primarily on the direction of the texture gradient ∇ρ, independent of the lighting or irradiance, while the magnitude is scaled by E. This is one explanation for the success of gradient-based lighting-insensitive recognition in computer vision [Chen et al. 2000].)
Analysis in 3D: In Appendix B, we extend the four basic shading
steps to 3D. This requires simple vector calculus and differential
geometry. While the algebra is more complex, we obtain very sim-
ilar results as in the 2D or flatland case. For example, the curvature
k simply corresponds to the principal curvatures in 3D. In fact, as
we will see in Sec. 4.4, it is possible and simpler to directly use the
straightforward 3D analogs of these 2D results for real images.
4.2 Gradients for Normal or Bump Maps
Section 4.1 assumes a local linearization of the surface. We now
generalize to arbitrary normal or bump maps, which are nonlinear
transformations. In this case, the per-point rotation step involves a
general function n(x) for the normal. By differentiating, using the
chain rule (or using Equation 8, with the Jacobian of the transform),

Lsx(x,θ) = Lx(x,θ+n(x)) + nx·Lθ(x,θ+n(x)),

where nx = ∂n/∂x. Hence, we can define a general per-point curvature, k(x) = nx = ∂n/∂x, assuming an arc-length parameterization. For normal maps n(x) = n̂(x) + n0(x), where n̂ is the bump map, and n0(x) is the base normal of the surface. Assuming the bump map has much higher frequencies than the base surface, k(x) ≈ ∂n̂/∂x, and depends primarily on the curvature of the bump map. If there is no bump map, k(x) is simply the curvature of the base surface ∂n0/∂x.
us generalize to bump maps very easily, with the general function
k(x) = nx= ∂n/∂x simply taking the place of k in Equation 10.
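As a concrete illustration (with a made-up base surface and bump map), the per-point curvature k(x) = nx of sampled normals is just a finite difference, and is dominated by the bump term when the bump frequency is high:

```python
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 2000)
dx = x[1] - x[0]
n0 = 0.1 * x                      # slowly varying base normal (curvature 0.1)
n_hat = 0.05 * np.sin(40.0 * x)   # high-frequency bump component
n = n0 + n_hat                    # total normal n(x) = n0(x) + n_hat(x)

k = np.gradient(n, dx)            # per-point curvature k(x) = n_x
k_bump = np.gradient(n_hat, dx)   # curvature of the bump map alone
# |k| peaks near 0.05 * 40 = 2, while k - k_bump is only the base
# curvature 0.1, so the bump term dominates, as argued above.
```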
We now combine the four light-surface interaction steps in Sec. 4.1,
replacing kx with n(x). From Equations 17 and 14,

B(x,θ) = Bs(x,θ−n(x)) = ∫ Lm(x,ω) ρ(ω−θ+n(x)) dω.    (22)

Upon substituting Equations 13 and 10 for Lm, we obtain Lm(x,ω) = Ls(x,−ω) = L(x,−ω+n(x)). Hence,

B(x,θ) = ∫ L(x,−ω+n(x)) ρ(ω−θ+n(x)) dω,    (23)

3We could also generalize the BRDF model beyond radially symmetric. The gradients would be essentially the same, but with the convolutions replaced by a general integral using the general BRDF ρ(x,θ,ω).
Figure 4: Magnitudes of various light field gradient terms, corresponding
to a variety of common situations and special cases. Entries not filled in
have “normal” values, depending on the specific lighting and BRDF.
where we set ω′ = n(x)−ω, and we end up with a standard convolution, but evaluated at the "reflected outgoing direction," given by θr = 2n(x)−θ, as one might expect.
Upon making similar substitutions for the gradients (Equations 10, 13, 16 and 17), and combining the linear transforms, we can write down the result explicitly, using θr = 2n(x)−θ for the reflected direction, and ⊗ for convolution,

Bx(x,θ) = (Lx⊗ρ)(x,θr) + 2nx·(Lθ⊗ρ)(x,θr)
Bθ(x,θ) = −(Lθ⊗ρ)(x,θr).    (25)
This is an overall formula for shading gradients on a curved sur-
face. While the initial derivation in Sec. 4.1 assumed the global
coordinate frame was aligned with the surface at x = 0, and used a
linearization of the surface as a conceptual tool, the final formula is
completely local, as expected for gradient analysis. We only need
the geometric curvature nxat a point, and the spatial and angular
gradients of the incident light Lxand Lθ, expressed in the local co-
ordinate or tangent frame—where x is a local arc-length parameterization of the surface. We have verified these results for a number of flatland scenes, with analytic examples and numerical evaluation.
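One such numerical verification can be sketched in flatland. The code below (with arbitrary periodic choices of our own for L, ρ, and n) checks both lines of Equation 25 against finite differences of B(x,θ) = (L⊗ρ)(x,θr):

```python
import numpy as np

omega = np.linspace(-np.pi, np.pi, 4001)
dw = omega[1] - omega[0]
eps = 1e-5

L = lambda x, w: np.exp(np.cos(w - 0.3 * x))   # incident light field
rho = lambda a: np.exp(4.0 * np.cos(a))        # symmetric periodic BRDF lobe
n = lambda x: 0.2 * np.sin(x)                  # normal (bump) function

def conv_at(F, x, t):
    # (F ⊗ ρ)(x, t) = ∫ F(x,ω) ρ(t − ω) dω, by Riemann sum
    return np.sum(F(x, omega) * rho(t - omega)) * dw

def B(x, t):                                   # Equation 23, at θr = 2n(x) − θ
    return conv_at(L, x, 2.0 * n(x) - t)

x0, t0 = 0.4, 0.25
Bx = (B(x0 + eps, t0) - B(x0 - eps, t0)) / (2 * eps)
Bt = (B(x0, t0 + eps) - B(x0, t0 - eps)) / (2 * eps)

Lx = lambda x, w: (L(x + eps, w) - L(x - eps, w)) / (2 * eps)
Lt = lambda x, w: (L(x, w + eps) - L(x, w - eps)) / (2 * eps)
nx = (n(x0 + eps) - n(x0 - eps)) / (2 * eps)
tr = 2.0 * n(x0) - t0

err_x = abs(Bx - (conv_at(Lx, x0, tr) + 2.0 * nx * conv_at(Lt, x0, tr)))
err_t = abs(Bt + conv_at(Lt, x0, tr))
```

Both residuals vanish to quadrature and finite-difference precision, matching the SV + CDV decomposition of Bx and the negated DV term in Bθ.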
For simplicity, we focus on homogeneous objects in this section.
However, incorporating spatial BRDF variation is straightforward.
First, consider the common case when ρ is a product of the current
angular BRDF, and a spatially-varying texture which simply mul-
tiplies the final result. We have already studied texture mapping
in Equation 20. The spatial gradient Bx involves a modulation of
Equation 25 by the texture, and an additional term corresponding to
the texture gradient modulated by the image intensity from Equa-
tion 23. This latter term can dominate in regions of large texture
gradients and corresponds to the observation that high-frequency
texture often masks slow shading variations. General spatially-
varying BRDFs require a generalization of the BRDF convolution
in step 3, as in Equation 19 of Sec. 4.1. The only additional term in
Equation 25 is (L⊗ρx)(x,θr), in the spatial gradient Bx.
4.4 Direct Extension to 3D
While our derivations have been in 2D, one can directly use the
3D analogs of these results for many rendering applications and
analyses. A formal, accurate 3D derivation of the shading steps is
given in Appendix B and is seen to have a very similar form.
To directly extend Equation 25 to 3D, we interpret the convo-
lutions ⊗ as 3D convolutions of lighting gradients and the BRDF,
over the full sphere of incident lighting directions. The 2D curvature nx is simply replaced by the Gaussian curvature of the surface.
For practical computations, the incident light field’s spatial and an-
gular gradients (corresponding to Lx and Lθ) can be determined
analytically where possible, or numerically otherwise, and usually
relate directly to the variation in intensity of the light sources.
Consider the spatial gradient Bx in 2D. In 3D, we will have two such expressions Bx and By. For the gradient magnitude visualizations in Sec. 4.5 or the non-uniform image sampling in Sec. 5, we consider the net magnitude (Bx^2 + By^2)^{1/2}. These magnitudes are independent of which specific (orthogonal) directions are picked for the axes x and y. For the angular gradients, we treat the direction θ as a unit vector, with Bθ corresponding to two gradients along the directions in the tangent plane to θ. Finally, we consider the net magnitude of these angular gradients in Secs. 4.5 and 5.
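The gradient-based image sampling of Sec. 5 can be sketched from this metric; the synthetic image, the quantile threshold, and the sparse-grid spacing below are all our own stand-ins, not the paper's actual renderer or algorithm:

```python
import numpy as np

h, w = 128, 128
ys, xs = np.mgrid[0:h, 0:w]
img = np.sin(xs / 6.0) * np.cos(ys / 9.0)      # stand-in shading
img[:, w // 2:] += 0.8 * (xs[:, w // 2:] / w)  # region with stronger variation

gy, gx = np.gradient(img)
mag = np.hypot(gx, gy)                         # net gradient magnitude

# shade every pixel in the top quartile of gradient magnitude, and only
# a sparse regular grid (every 4th pixel) elsewhere
thresh = np.quantile(mag, 0.75)
dense = mag >= thresh
sparse = np.zeros_like(dense)
sparse[::4, ::4] = True
samples = dense | sparse
fraction = samples.mean()                      # fraction of pixels shaded
```

The unsampled pixels would then be filled by interpolation, concentrating effort in high-gradient regions.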
4.5 Implications: Analysis of Gradient Terms

We now discuss some implications of Equation 25. Figure 4 shows
a number of common situations. To aid our discussion, we label
the important terms. We refer to Lx⊗ρ as the spatial variation
(SV) term in the lighting. Analogously, Lθ⊗ρ is the directional
variation (DV) term—the directional variation in the reflected light
field Bθ is essentially the same as the incident DV. We refer to 2nx
as the curvature (CV) term, and the product 2nx(Lθ⊗ρ) as the
curvature directional variation (CDV) term. Spatial gradients in
the reflected light field Bxare a combination of SV and CDV terms.
We first describe how various factors (lighting, geometry and
materials) affect shading gradients. Fig. 4 summarizes our insights.
Then, we use a simple 3D scene to illustrate some of these effects.
In distant lighting, there is no spatial lighting variation
SV (Lx = 0), and spatial gradients Bx are due solely to the curvature and angular lighting variation (CDV). If the environment itself
varies little (low DV, small | Lθ|), such as an overcast sky, we get
soft shading effects with little spatial variation (| Bx| is small). On
the other hand, for a near light source, there is significant spatial
variation (large Lx), and both SV and CDV must be considered.
A bump-mapped surface has high curvature, so the
directional term CDV will be large, and the main contributor to
Bx. On the other hand, a flat surface has no curvature, so the CDV
term vanishes, and only the spatial variation Lxin the lighting can
induce shading changes. A particularly interesting special case is
a flat surface in a distant environment map. In this case, we get
uniform shading across the surface, and indeed Bx = 0. For a Lambertian object (or the diffuse lobe of a general material), the BRDF
ρ is a low-pass filter that causes the directional shading DV to be
low-frequency and smooth. Hence, strong spatial gradients in the
lighting (the SV term) can often be important to the overall shad-
ing. Moreover, we know that sharp edges cannot come from the
DV term, and will either be at geometric discontinuities (very high
curvature) or because of strong spatial variation in lighting. On the
other hand, for a mirror surface, like a chrome-steel sphere often
used to estimate the illumination, we will see the full directional
variation in the lighting, and DV will be high.
We can also make some quantitative statements. The spatial term SV and the directional term CDV will be of roughly the same magnitude when |Lx| ∼ 2|nx||Lθ|. In the simple case when the near light source(s) is isotropic and at a distance d, from trigonometry, Lx ≈ Lθ/d, so the condition for far lighting becomes 1/d ≪ 2|nx|, which relates the distance of the lighting to the surface curvature.
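A worked instance of this criterion (with made-up distances and curvatures, and a factor of 10 as a crude proxy for "much less than"):

```python
d = 2.0                  # source distance
k_bump = 5.0             # |n_x| on a bumpy surface
k_table = 0.01           # |n_x| on a nearly flat table

def far_enough(d, k, margin=10.0):
    # treat the source as distant when 1/d << 2|n_x|
    return 1.0 / d < (2.0 * k) / margin

far_for_bump = far_enough(d, k_bump)    # 0.5 < 1.0   -> distant is fine
far_for_table = far_enough(d, k_table)  # 0.5 < 0.002 -> must model exactly
```

The same source is "far" for the bumpy surface but not for the table, matching the discussion below.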
This criterion depends on the curvature—a light source that is far
for a bump-mapped object may not be classified as far for a flat
table. One application is efficient rendering approximation, where
light sources could be treated as distant for bump-mapped or other
high-curvature surfaces, while being modeled exactly in flat regions
based on these criteria. There are similar applications for inverse
problems and perception—it will be relatively easier to estimate
near-field lighting effects from flatter objects than curved surfaces.
We illustrate some of these ideas with a simple didactic 3D scene
in Fig. 5 that includes a nearly flat (but not zero curvature)
table on which sit a diffuse, diffuse+glossy, and bumpy sphere. The
scene is lit by a moderately close area source. We use the direct 3D
analogs of the 2D gradients, as discussed in Sec. 4.4. The various gradient terms are visualized in Figs. 5b-5e. The spatial gradient of the (moderately near) lighting (b) can be
large, and is primarily responsible for the highlight on the (nearly)
flat table. Indeed, CDV is very low on the table, while being highest
on the bumpy sphere. CDV is also responsible for effects like the
specular highlight on the glossy sphere. Figure 5e plots the ratio