Joint Source-Channel Coding with Correlated Interference
ABSTRACT We study the joint source-channel coding problem of transmitting a
discrete-time analog source over an additive white Gaussian noise (AWGN)
channel with interference known at the transmitter. We consider the case when the
source and the interference are correlated. We first derive an outer bound on
the achievable distortion and then, we propose two joint source-channel coding
schemes. The first scheme is the superposition of the uncoded signal and a
digital part which is the concatenation of a Wyner-Ziv encoder and a dirty
paper encoder. In the second scheme, the digital part is replaced by the hybrid
digital and analog scheme proposed by Wilson et al. When the channel
signal-to-noise ratio (SNR) is perfectly known at the transmitter, both proposed
schemes are shown to provide identical performance which is substantially
better than that of existing schemes. In the presence of an SNR mismatch, both
proposed schemes are shown to be capable of graceful enhancement and graceful
degradation. Interestingly, unlike the case when the source and interference
are independent, neither of the two schemes outperforms the other universally.
As an application of the proposed schemes, we provide both inner and outer
bounds on the distortion region for the generalized cognitive radio channel.
arXiv:1009.0304v2 [cs.IT] 28 Feb 2011
Joint Source-Channel Coding with Correlated Interference
Yu-Chih Huang and Krishna R. Narayanan
Department of Electrical and Computer Engineering
Texas A&M University
In this paper, we study the joint source-channel coding problem of transmitting a discrete-time analog source over an additive
white Gaussian noise (AWGN) channel with interference known at the transmitter. We consider the case when the source and the
interference are correlated. We first derive an outer bound on the achievable distortion and then, we propose two joint source-channel
coding schemes to make use of the correlation between the source and the interference. The first scheme is the superposition of
the uncoded signal and a digital part which is the concatenation of a Wyner-Ziv encoder and a dirty paper encoder. In the second
scheme, the digital part is replaced by a hybrid digital and analog scheme so that the proposed scheme can provide graceful
degradation in the presence of a signal-to-noise ratio (SNR) mismatch. Interestingly, unlike the independent interference setup, we
show that neither of the two schemes outperforms the other universally in the presence of SNR mismatch. These coding schemes are
further utilized to obtain the achievable distortion region of the generalized cognitive radio channels.
Index Terms: Distortion region, joint source-channel coding, cognitive radios.
I. INTRODUCTION AND PROBLEM STATEMENT
In this paper, we consider transmitting a length-n i.i.d. zero-mean Gaussian source V^n = (V(1), V(2), ..., V(n)) over n
uses of an additive white Gaussian noise (AWGN) channel with noise Z^n ~ N(0, N·I) in the presence of Gaussian interference
S^n which is known at the transmitter, as shown in Fig. 1. Throughout the paper, we only focus on the bandwidth-matched
case, i.e., the number of channel uses is equal to the source's length. The transmitted signal X^n = (X(1), X(2), ..., X(n))
is subject to a power constraint
E[X(i)^2] ≤ P,
where E[·] represents the expectation operation. The received signal Y^n is given by
Y^n = X^n + S^n + Z^n.
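As a quick illustration of this setup (a numerical sketch, not part of the paper; the parameter values and the plain-amplification encoder are arbitrary choices), one can draw a correlated pair (V, S), pass X through the channel, and check the designed correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_v, sigma_s, rho, P, N = 1.0, 1.0, 0.6, 1.0, 1.0

# Correlated source/interference pair (V, S) drawn from the joint Gaussian model.
cov = [[sigma_v**2, rho * sigma_v * sigma_s],
       [rho * sigma_v * sigma_s, sigma_s**2]]
V, S = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Placeholder encoder: plain amplification to power P (the uncoded scheme discussed later).
X = np.sqrt(P / sigma_v**2) * V
Z = rng.normal(0.0, np.sqrt(N), size=n)   # AWGN
Y = X + S + Z                             # channel output seen by the decoder

print(round(float(np.corrcoef(V, S)[0, 1]), 2))   # close to the designed rho = 0.6
```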
We are interested in the expected distortion between the source and the estimate V̂^n at the output of the decoder, given by
d = E[d(V^n, g(f(V^n, S^n) + S^n + Z^n))],
where f and g are a pair of source-channel coding encoder and decoder, respectively, and d(·,·) is the mean squared error
(MSE) distortion measure given by
d(v, v̂) = (1/n) Σ_{i=1}^{n} (v(i) − v̂(i))^2.
Here the lower case letters represent realizations of random variables denoted by upper case letters. As in , a distortion D
is achievable under power constraint P if for any ε > 0, there exists a source-channel code and a sufficiently large n such that
d ≤ D + ε.
Fig. 1. Joint source-channel coding with interference known at transmitter.
When V and S are uncorrelated, it is known that an optimal quantizer followed by Costa's dirty paper coding (DPC) 
is optimal and the corresponding joint source-channel coding problem is fully discussed in . However, different from the
typical writing on dirty paper problem, in this paper, we consider the case where the source and the interference are correlated
with a covariance matrix given by
Λ = [ σ_V^2      ρσ_V σ_S
      ρσ_V σ_S   σ_S^2  ].
Under this assumption, separate source and channel coding using DPC naively may not be a good candidate for encoding
V^n in general. This is due to the fact that the DPC tries to completely avoid the interference without a signal-to-noise ratio (SNR)
penalty, so that it cannot take advantage of the correlation between the source and the interference. In this paper, we first
derive an outer bound on the achievable distortion region and then, we propose two joint source-channel coding schemes which
exploit the correlation between V^n and S^n, thereby outperforming the naive DPC scheme. The first scheme is a superposition
of the uncoded scheme and a digital part formed by a Wyner-Ziv coding  followed by a DPC, which we refer to as a
superposition-based scheme with digital DPC (or just the superposition-based scheme). The second scheme is obtained by
replacing the digital part by a hybrid digital and analog (HDA) scheme given in  that has been shown to provide graceful
degradation under an SNR mismatch. We then analyze the performance of these two proposed schemes for SNR mismatch
cases. It is shown that both the HDA scheme and the superposition-based digital scheme benefit from a higher SNR; however,
interestingly, their performances are different.
One interesting application of this problem is to derive the achievable distortion region for the generalized cognitive radio
channels considered in  (also in ). This channel can be modeled as a typical two-user interference channel, except that one
of the transmitters knows exactly what the other plans to transmit. We can regard the informed user's channel as the setup we consider
in this section and then analyze achievable distortion regions for several different cases.
The rest of the paper is organized as follows. In section II, we present some prior works which are closely related to ours.
The outer bound is given in section III and two proposed schemes are given in section IV. In section V, we analyze the
performance of the proposed schemes under SNR mismatch. These proposed schemes are then extended to the generalized
cognitive radio channels in section VI. Some conclusions are given in section VII.
II. RELATED WORKS ON JSCC WITH INTERFERENCE KNOWN AT TRANSMITTER
In , Lapidoth et al. consider the 2×1 multiple access channel in which two transmitters wish to communicate their sources,
which are drawn from a bi-variate Gaussian distribution, to a receiver which is interested in reconstructing both sources. There
are some similarities between the work in  and ours. However, an important difference is that the transmitters are not allowed
to cooperate with each other, i.e., for a given transmitter, the interference is not known.
In , Tian et al. consider transmitting a bi-variate Gaussian source over a 1×2 Gaussian broadcast channel. In their setup,
the source consists of two components V_1^n and V_2^n, which are memoryless, stationary, and bi-variate Gaussian distributed, and each
receiver is only interested in one part of the source. They proposed an HDA scheme which performs optimally in terms of the
distortion region under all SNRs. At first glance, this problem is again similar to ours if we ignore receiver 2 and focus on
the other. Then this problem reduces to communicating V_1^n with correlated side-information V_2^n given at the transmitter. A
crucial difference is that this side-information does not appear in the received signal.
Joint source-channel coding for point-to-point communications over Gaussian channels has been widely discussed, e.g., ,
, . However, these works either do not consider interference (, ) or assume independence of the source and interference ().
In , Wilson et al. proposed an HDA coding scheme for the typical writing on dirty paper problem in which the source is
independent of the interference. This HDA scheme was originally proposed to perform well in the case of an SNR mismatch. In
, the authors showed that their HDA scheme not only achieves the optimal distortion in the absence of SNR mismatch but
also provides graceful degradation in the presence of SNR mismatch. In the following sections, we will discuss this scheme
in detail and then propose a coding scheme based on this one.
From now on, since all the random variables we consider are i.i.d. in time, i.e., V(i) is independent of V(j) for i ≠ j, we
will drop the index i for the sake of convenience.
III. OUTER BOUNDS
A. Outer Bound 1
For comparison, we first present a genie-aided outer bound. This outer bound is derived in a similar way to the one in 
in which we assume that S is revealed to the decoder by a genie. Thus, we have
(1/2) log( σ_V^2 (1 − ρ^2) / D ) ≤(a) I(V; V̂ | S)
                                 ≤(b) I(V; Y | S)
                                 = h(Y | S) − h(Y | S, V)
                                 = h(X + Z | S) − h(Z)
                                 ≤(c) h(X + Z) − h(Z)
                                 ≤(d) (1/2) log(1 + P/N),
where (a) follows from rate-distortion theory , (b) is from the data processing inequality, (c) is due to the fact that conditioning
reduces differential entropy, and (d) comes from the fact that the Gaussian density maximizes the differential entropy. Therefore,
we have the outer bound as
D_ob1 = σ_V^2 (1 − ρ^2) / (1 + P/N).
Note that this outer bound is in general not tight for our setup since in the presence of correlation, giving S to the decoder
also offers a correlated version of the source that we wish to estimate. For example, in the case of ρ = 1, giving S to the
decoder implies that the outer bound is D_ob1 = 0 no matter what the received signal Y is. On the other hand, if ρ = 0, the
setup reduces to the one with uncorrelated interference and we know that this outer bound is tight. Now, we present another
outer bound that improves this outer bound for some values of ρ.
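As a sanity check on the endpoint behavior just described, the bound can be evaluated numerically (an illustrative sketch; the closed form D_ob1 = σ_V^2(1 − ρ^2)/(1 + P/N) is our reading of the derivation above, and the parameter values are arbitrary):

```python
def d_ob1(sigma_v2: float, rho: float, P: float, N: float) -> float:
    """Genie-aided lower bound on the MSE when S is revealed to the decoder."""
    return sigma_v2 * (1.0 - rho**2) / (1.0 + P / N)

P, N, sigma_v2 = 1.0, 1.0, 1.0
print(d_ob1(sigma_v2, 1.0, P, N))   # 0.0: knowing S at rho = 1 means knowing V
print(d_ob1(sigma_v2, 0.0, P, N))   # 0.5: reduces to sigma_V^2 / (1 + P/N)
```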
B. Outer Bound 2
Since S and V are drawn from a jointly Gaussian distribution with the covariance matrix given in (5), we can write
S = ρ(σ_S/σ_V) V + N_ρ,
where N_ρ ~ N(0, (1 − ρ^2)σ_S^2) and is independent of V. Now, suppose a genie reveals only N_ρ to the decoder; we have
(1/2) log( σ_V^2 / D ) ≤(a) I(V; V̂ | N_ρ)
                       ≤(b) I(V; Y | N_ρ)
                       = h(Y | N_ρ) − h(Y | N_ρ, V)
                       = h(X + ρ(σ_S/σ_V)V + Z | N_ρ) − h(Z)
                       ≤(c) h(X + ρ(σ_S/σ_V)V + Z) − h(Z)
                       ≤(d) (1/2) log( Var(X + ρ(σ_S/σ_V)V + Z) / N )
                       ≤(e) (1/2) log( 1 + (√P + ρσ_S)^2 / N ),
where (a)-(d) follow for the same reasons as in the previous outer bound and (e) is due to the Cauchy-Schwarz
inequality, which implies that the maximum occurs when X and V are collinear. Thus, we have
D_ob2 = σ_V^2 / ( 1 + (√P + ρσ_S)^2 / N ).
Note that although the encoder knows the interference S exactly, instead of just N_ρ, the outer bound is valid since S is a
function of V and N_ρ.
Remark 1: If ρ = 0, this outer bound reduces to the previous one and is tight. If ρ = 1, the genie actually reveals nothing
to the decoder and the setup reduces to the one considered in , where the transmitter is interested in revealing the interference
to the decoder. For this case, we know that this outer bound is tight. However, this outer bound is in general optimistic except
at the two extremes. This is due to the fact that in the derivation, we assume that we can simultaneously ignore N_ρ and use all the
power to take advantage of the coherent part. Despite this, the outer bound still provides the insight that, in order to build a
good coding scheme, one should try to use a portion of the power to make use of the correlation and then use the remaining
power to avoid N_ρ.
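To see where each bound dominates, both can be evaluated over ρ (an illustrative sketch; the closed forms D_ob1 = σ_V^2(1 − ρ^2)/(1 + P/N) and D_ob2 = σ_V^2/(1 + (√P + ρσ_S)^2/N) are our reading of the two derivations, and the parameters are arbitrary):

```python
import math

def d_ob1(sigma_v2, rho, P, N):
    # Outer bound 1: genie reveals S to the decoder.
    return sigma_v2 * (1 - rho**2) / (1 + P / N)

def d_ob2(sigma_v2, sigma_s2, rho, P, N):
    # Outer bound 2: genie reveals only N_rho; the coherent part enters via Cauchy-Schwarz.
    return sigma_v2 / (1 + (math.sqrt(P) + rho * math.sqrt(sigma_s2))**2 / N)

P = N = sigma_v2 = sigma_s2 = 1.0
for rho in (0.0, 0.2, 0.9):
    b1, b2 = d_ob1(sigma_v2, rho, P, N), d_ob2(sigma_v2, sigma_s2, rho, P, N)
    print(f"rho={rho}: D_ob1={b1:.3f}, D_ob2={b2:.3f}, combined={max(b1, b2):.3f}")
```

For this choice of parameters the two bounds coincide at ρ = 0, the first dominates (is the larger lower bound) at small ρ, and the second dominates near ρ = 1, which is why combining them by taking the maximum is natural.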
Further, it is natural to combine these two outer bounds as
D_ob = max{ D_ob1, D_ob2 }.
IV. PROPOSED SCHEMES
A. Uncoded Scheme
We first analyze the distortion of the uncoded scheme, where the transmitted signal is simply a scaled version of the source:
X = √(P/σ_V^2) V.
Thus, (2) becomes
Y = √(P/σ_V^2) V + S + Z.
The receiver forms the linear MMSE estimate of V from Y as V̂ = βY, where
β = ( √(P/σ_V^2) σ_V^2 + ρσ_V σ_S ) / ( P + σ_S^2 + N + 2√(P/σ_V^2) ρσ_V σ_S ).
The corresponding distortion is then given as
D_u = σ_V^2 ( 1 − β( √(P/σ_V^2) + ρσ_S/σ_V ) ).
Remark 2: If ρ = 1 and σ_V^2 = σ_S^2, the source and the interference are exactly the same, and the problem reduces to
transmitting V over an AWGN channel Z with power constraint (√P + σ_V)^2. From , we know that the uncoded
scheme is optimal for this case. One can also think of this scenario as one in which the transmitter is only interested in revealing the
channel state S to the receiver. In , the authors have shown that the pure amplification (uncoded) scheme is optimal for
this problem. Therefore, we can expect that the uncoded scheme will achieve the optimal distortion when ρ = 1.
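The ρ = 1 claim in Remark 2 can be checked numerically (a sketch under our reading of the derivation above; β and D_u follow the linear-MMSE expressions, and the parameter values are arbitrary):

```python
import math

def uncoded_distortion(sigma_v2, sigma_s2, rho, P, N):
    # Linear-MMSE distortion of the uncoded scheme: X = sqrt(P/sigma_V^2) V, Vhat = beta*Y.
    alpha = math.sqrt(P / sigma_v2)
    e_vy = alpha * sigma_v2 + rho * math.sqrt(sigma_v2 * sigma_s2)              # E[VY]
    e_y2 = P + sigma_s2 + N + 2 * alpha * rho * math.sqrt(sigma_v2 * sigma_s2)  # E[Y^2]
    return sigma_v2 - e_vy**2 / e_y2   # sigma_V^2 - beta*E[VY], with beta = E[VY]/E[Y^2]

P = N = sigma_v2 = sigma_s2 = 1.0
# At rho = 1 with sigma_V = sigma_S, S = V and the setup behaves like an AWGN channel
# with transmit amplitude sqrt(P) + sigma_V, for which pure amplification is optimal.
d_u = uncoded_distortion(sigma_v2, sigma_s2, 1.0, P, N)
d_awgn = sigma_v2 / (1 + (math.sqrt(P) + math.sqrt(sigma_v2))**2 / N)
print(d_u, d_awgn)   # both 0.2
```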
B. Naive DPC Scheme
Another existing scheme is the concatenation of an optimal source code and a DPC. The optimal source code quantizes the
analog source with a rate arbitrarily close to the channel capacity (1/2) log(1 + P/N). Then, the DPC ignores the correlation
between the source and interference (this can be done by a randomization and de-randomization pair) and encodes the
quantization output accordingly. Since the DPC achieves the rate equal to that when there is no interference at all, the receiver
can correctly decode these digital bits with high probability. By rate-distortion theory, we have the corresponding distortion
D_dpc = σ_V^2 / (1 + P/N).
Remark 3: In the absence of correlation, i.e., ρ = 0, the problem reduces to the typical writing on dirty paper setup and it
is known that this scheme is optimal but the uncoded scheme is strictly suboptimal. Therefore, we can expect that when the
correlation is small, this naive DPC scheme will outperform the uncoded scheme.
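The crossover suggested in Remarks 2 and 3 can be made concrete (an illustrative sketch; D_dpc = σ_V^2/(1 + P/N) as above, and the uncoded distortion follows our linear-MMSE reading of the previous subsection; parameters are arbitrary):

```python
import math

def naive_dpc_distortion(sigma_v2, P, N):
    # Quantize at a rate near capacity (1/2)log(1 + P/N); DPC removes the interference.
    return sigma_v2 / (1 + P / N)

def uncoded_distortion(sigma_v2, sigma_s2, rho, P, N):
    # Linear-MMSE distortion of the uncoded (amplification) scheme.
    alpha = math.sqrt(P / sigma_v2)
    e_vy = alpha * sigma_v2 + rho * math.sqrt(sigma_v2 * sigma_s2)
    e_y2 = P + sigma_s2 + N + 2 * alpha * rho * math.sqrt(sigma_v2 * sigma_s2)
    return sigma_v2 - e_vy**2 / e_y2

P = N = sigma_v2 = sigma_s2 = 1.0
print(naive_dpc_distortion(sigma_v2, P, N))               # 0.5, independent of rho
print(uncoded_distortion(sigma_v2, sigma_s2, 0.0, P, N))  # ~0.667: naive DPC wins at rho = 0
print(uncoded_distortion(sigma_v2, sigma_s2, 1.0, P, N))  # 0.2: uncoded wins at rho = 1
```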
C. Superposition-Based Scheme with Digital DPC
We now propose a superposition-based scheme which retains the advantages of the above two schemes. This scheme can
be regarded as an extended version of the coding scheme in  to the setup we consider. As shown in Fig. 2, the transmitted
signal of this scheme is the superposition of the analog part X_a with power P_a and the digital part X_d with power P − P_a.
The motivation here is to allocate some power for the analog part to make use of the interference which is somewhat coherent
to the source for large ρ’s and to assign more power to the digital part to avoid the interference when ρ is small. The analog
part is a scaled linear combination of the source and the interference:
X_a = √a ( γV + (1 − γ)S ),
where P_a ∈ [0, P], a = P_a/σ_a^2, γ ∈ [0, 1] and
σ_a^2 = γ^2 σ_V^2 + (1 − γ)^2 σ_S^2 + 2γ(1 − γ)ρσ_V σ_S.
The received signal is given by
Y = X_d + X_a + S + Z
  = X_d + √a(γV + (1 − γ)S) + S + Z
  = X_d + √a γV + (1 + √a(1 − γ))S + Z
  = X_d + S′ + Z,
Fig. 2.Superposition-based scheme.
where X_d is chosen to be orthogonal to S and V. The receiver first makes an estimate from Y only as V′ = βY with
β = ( √a(γσ_V^2 + (1 − γ)ρσ_V σ_S) + ρσ_V σ_S ) / ( P + N + σ_S^2 + 2√a((1 − γ)σ_S^2 + γρσ_V σ_S) ).
The corresponding MSE is
D* = σ_V^2 ( 1 − β( √a(γ + (1 − γ)ρσ_S/σ_V) + ρσ_S/σ_V ) ).
Thus, we can write V = V′ + W with W ~ N(0, D*).
We now refine the estimate through the digital part, which is the concatenation of a Wyner-Ziv coding and a DPC. Since
the DPC achieves the rate equal to that when there is no interference at all, the encoder can use the remaining power P − P_a
to reliably transmit the refining bits T with a rate arbitrarily close to
R = (1/2) log( 1 + (P − P_a)/N ).
The resulting distortion after refinement is then given as
D_sup = D* / ( 1 + (P − P_a)/N ).
In Appendix A, for self-containedness, we briefly summarize the digital Wyner-Ziv scheme to illustrate how to achieve this distortion.
It is worth noting that setting γ = 1 always gives us the lowest distortion, i.e., superimposing S onto the transmitted signal
is completely unnecessary. However, this is in general not true for the cognitive radio setup. We will discuss this in detail in section VI.
Remark 4: Different from the setup considered in , where the optimal distortion can be achieved by any power allocation
between the coded and uncoded transmissions, in our setup the optimal distortion is in general achieved by a particular power
allocation which is a function of ρ. For example, in the absence of correlation, i.e., when S is completely independent of V, one
can simply set P_a = 0 and this scheme reduces to the naive DPC, which is optimal in this case. On the other hand, if ρ = 1,
the optimal distortion is achieved by setting P_a = P. Moreover, for ρ > 0, it is beneficial to have a non-zero P_a to make use
of the correlation between the source and the interference.
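Remark 4 can be illustrated numerically. The sketch below (our reconstruction of β, D* and D_sup from this subsection, specialized to γ = 1; all parameter values are arbitrary) sweeps the power split P_a and compares against the two baseline schemes; for ρ = 0.5 an interior split beats both:

```python
import math

def superposition_distortion(Pa, sigma_v2, sigma_s2, rho, P, N):
    # gamma = 1: analog part X_a = sqrt(a) V with a = Pa / sigma_V^2.
    sqrt_a = math.sqrt(Pa / sigma_v2)
    sv, ss = math.sqrt(sigma_v2), math.sqrt(sigma_s2)
    beta = (sqrt_a * sigma_v2 + rho * sv * ss) / (P + N + sigma_s2 + 2 * sqrt_a * rho * sv * ss)
    d_star = sigma_v2 * (1 - beta * (sqrt_a + rho * ss / sv))  # MSE of the analog-only estimate
    # Wyner-Ziv refinement carried by DPC at rate (1/2) log(1 + (P - Pa)/N):
    return d_star / (1 + (P - Pa) / N)

P = N = sigma_v2 = sigma_s2 = 1.0
rho = 0.5
d_dpc = sigma_v2 / (1 + P / N)                                      # naive DPC baseline
d_unc = superposition_distortion(P, sigma_v2, sigma_s2, rho, P, N)  # Pa = P recovers the uncoded scheme
best = min(superposition_distortion(pa / 100, sigma_v2, sigma_s2, rho, P, N) for pa in range(101))
print(f"naive DPC={d_dpc:.4f}, uncoded={d_unc:.4f}, best superposition={best:.4f}")
```

The best split here strictly improves on both baselines, matching the remark that for ρ > 0 a non-zero (but not full) analog power P_a is beneficial.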
D. HDA Scheme
Now, let us focus on the HDA scheme shown in Fig. 3 obtained by replacing the digital part in Fig. 2 by the HDA scheme
given in . The analog signal remains the same as (17) and the HDA output is referred to as X_h. Therefore, we have
Y = X_h + √a γV + (1 + √a(1 − γ))S + Z
  = X_h + S′ + Z.
Again, the HDA scheme regards S′ as interference and V′ described previously as side-information. The encoding and decoding
procedures are similar to those in  but the coefficients need to be re-derived to fit our setup (the reader is referred to  for details).
Let the auxiliary random variable U be
U = X_h + αS′ + κV,
where X_h ~ N(0, P_h) is independent of S′ and V, and P_h = P − P_a. The covariance matrix of S′ and V can be computed by