Conference Paper

Remote Source Coding Under Gaussian Noise: Dueling Roles of Power and Entropy-Power

... For arbitrary discrete source distributions and the logarithmic loss distortion measure, Courtade and Weissman completely characterized the CEO rate region in [13]. Moreover, asymptotic analyses for an infinite number of sensors have been performed in [14], [15]. Berger et al. investigated the error rate performance for a discrete source with the Hamming distance as a distortion measure and exhibited an inevitable loss due to non-cooperating sensors [14]. A scaling law on the sum-rate distortion function for arbitrary distortion measures has been derived in [15]. Despite the substantial recent progress, rate regions have been characterized only for particular distributions of the relevant signal and specific distortion measures and are generally still unknown. ...
... which has to be maximized. Pursuing the greedy optimization approach, the objective in (15) can be decomposed into M target functions that are maximized successively. ...
Article
Full-text available
This paper addresses the optimization of distributed compression in a sensor network. A direct communication among the sensors is not possible, so noisy measurements of a single relevant signal have to be locally compressed in order to meet the rate constraints of the communication links to a common receiver. This scenario is widely known as the Chief Executive Officer (CEO) problem and represents a long-standing problem in information theory. In recent years significant progress has been achieved and the rate region has been completely characterized for specific distributions of the involved processes and distortion measures. While algorithmic solutions of the CEO problem are principally known, their practical implementation quickly becomes challenging for complexity reasons. In this contribution, an efficient greedy algorithm to determine feasible solutions of the CEO problem is derived using the information bottleneck (IB) approach. Following the Wyner-Ziv coding principle, the quantizers are successively designed using already optimized quantizer mappings as side-information. However, processing this side-information in the optimization algorithm becomes a major bottleneck because the memory complexity grows exponentially with the number of sensors. Therefore, a sequential compression scheme leading to a compact representation of the side-information and ensuring moderate memory requirements even for larger networks is introduced. This internal compression is again optimized by means of the IB method. Numerical results demonstrate that the overall loss in terms of relevant mutual information can be made sufficiently small even with a significant compression of the side-information. The performance is compared to that of separately optimized quantizers and of a centralized quantization. Moreover, the influence of the optimization order for asymmetric scenarios is discussed.
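As a rough illustration of the quantizer-design step described in this abstract (not the authors' implementation), the sketch below greedily reassigns each observation value to the cluster index that maximizes the relevant mutual information $I(X;Z)$, a hard, deterministic variant of the information bottleneck. All function names and the toy joint distribution are invented for the example.

```python
import numpy as np

def mutual_information(pxz):
    """I(X;Z) in bits for a joint pmf given as a 2-D array."""
    px = pxz.sum(axis=1, keepdims=True)
    pz = pxz.sum(axis=0, keepdims=True)
    mask = pxz > 0
    return float((pxz[mask] * np.log2(pxz[mask] / (px @ pz)[mask])).sum())

def greedy_ib_quantizer(pxy, num_clusters, iters=20):
    """Coordinate-ascent (hard IB) sketch: move each y to the cluster z
    that maximizes the relevant information I(X;Z)."""
    ny = pxy.shape[1]
    assign = np.arange(ny) % num_clusters          # initial mapping y -> z
    for _ in range(iters):
        changed = False
        for y in range(ny):
            current = assign[y]
            best, best_val = current, -np.inf
            for z in range(num_clusters):
                assign[y] = z                      # tentatively try cluster z
                pxz = np.zeros((pxy.shape[0], num_clusters))
                for yy in range(ny):
                    pxz[:, assign[yy]] += pxy[:, yy]
                val = mutual_information(pxz)
                if val > best_val:
                    best, best_val = z, val
            assign[y] = best
            if best != current:
                changed = True
        if not changed:                            # converged to a local optimum
            break
    return assign

# Toy joint pmf of a binary relevant signal X and a 4-level observation Y.
pxy = np.array([[0.35, 0.10, 0.05, 0.00],
                [0.00, 0.05, 0.10, 0.35]])
assign = greedy_ib_quantizer(pxy, num_clusters=2)
```

On this toy distribution the greedy pass groups the two observation levels that point to X = 0 into one cluster and the two pointing to X = 1 into the other.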
... and (6b), for some joint distribution (7), where $[x]^+ = \max\{0, x\}$ and $U_{\mathcal{A}} \leftrightarrow Y_{\mathcal{A}} \leftrightarrow X \leftrightarrow Y_{\mathcal{A}^c} \leftrightarrow U_{\mathcal{A}^c}$ forms a Markov chain for any $\mathcal{A} \subseteq \mathcal{I}_L$. ...
... and $D_{\mathrm{th}} = H(X, U_{\mathcal{I}_L}) - H(U_{\mathcal{I}_L}) = H(X) + H(U_{\mathcal{I}_L} \mid X) - H(U_{\mathcal{I}_L}) = H(X) + H(N_{\mathcal{I}_L} \oplus Z_{\mathcal{I}_L}) - H(U_{\mathcal{I}_L}) = 1 + \sum_{l=1}^{L} h_b(P_l) - H(U_{\mathcal{I}_L})$. Generally, there is a compound LDGM-LDPC code in the first and the $L$-th link, and there are two compound codes in the $i$-th link for $i \in [2 : L-1]$; thus, there are $2L-2$ compound codes in total in an $L$-link case. ...
Article
The L-link binary Chief Executive Officer (CEO) problem under logarithmic loss is investigated in this paper. A quantization splitting technique is applied to convert the problem under consideration to a (2L−1)-step successive Wyner-Ziv (WZ) problem, for which a practical coding scheme is proposed. In the proposed scheme, Low-Density Generator-Matrix (LDGM) codes are used for binary quantization while Low-Density Parity-Check (LDPC) codes are used for syndrome generation; the decoder performs successive decoding based on the received syndromes and produces a soft reconstruction of the remote source. The simulation results indicate that the rate-distortion performance of the proposed scheme can approach the theoretical inner bound based on binary-symmetric test-channel models.
... In contrast to the jointly Gaussian CEO problem, non-Gaussian and non-quadratic CEO problems have received less attention due to their limited analytic tractability. A non-regular source-observation pair, such as a copula model or truncated Gaussian noise, was considered under quadratic distortion [7], and a general continuous source with additive Gaussian noise was considered under quadratic and general distortion [8]. Toward generalizing the source-observation pair, it was shown that Gaussianity is in fact the worst case [9]. ...
Preprint
The CEO problem has received a lot of attention since Berger et al. first investigated it; however, there are limited results on non-Gaussian models with non-quadratic distortion measures. In this work, we extend the CEO problem to two continuous alphabet settings with general $r$th power of difference and logarithmic distortions, and study asymptotics of distortion as the number of agents and sum rate grow without bound. The first setting is a regular source-observation model, such as jointly Gaussian, with difference distortion and we show that the distortion decays at $R_{\textsf{sum}}$ up to a multiplicative constant. We use sample median estimation following the Berger-Tung scheme for achievability and the Shannon lower bound for the converse. The other setting is a non-regular source-observation model, such as copula or uniform additive noise models, with difference distortion for which estimation-theoretic regularity conditions do not hold. The optimal decay $R_{\textsf{sum}}$ is obtained for the non-regular model by midrange estimator following the Berger-Tung scheme for achievability and the Chazan-Ziv-Zakai bound for the converse. Lastly, we provide a condition for the regular model, under which quadratic and logarithmic distortions are asymptotically equivalent by entropy power relation as the number of agents grows. This proof relies on the Bernstein-von Mises theorem.
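The two fusion rules named in this abstract can be contrasted in a toy simulation. The sketch below (all names and parameters are invented, and the source is fixed to X = 0) compares the sample median under Gaussian observation noise with the midrange under uniform noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(noise, L=99, trials=2000):
    """Mean-squared error of two fusion rules for X = 0 observed by L agents."""
    errs_med, errs_mid = [], []
    for _ in range(trials):
        y = noise(rng, L)                       # noisy observations of X = 0
        errs_med.append(np.median(y) ** 2)      # sample-median estimate
        mid = 0.5 * (y.min() + y.max())         # midrange estimate
        errs_mid.append(mid ** 2)
    return np.mean(errs_med), np.mean(errs_mid)

gauss = lambda rng, L: rng.normal(0.0, 1.0, L)      # regular model
unif  = lambda rng, L: rng.uniform(-1.0, 1.0, L)    # non-regular model

mse_med_g, mse_mid_g = simulate(gauss)
mse_med_u, mse_mid_u = simulate(unif)
```

With L = 99 agents the midrange error under uniform noise decays on the order of $1/L^2$, far faster than the median's $1/L$, while under Gaussian noise the median wins, which mirrors the regular/non-regular distinction drawn in the abstract.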
Article
Full-text available
This paper addresses the optimization of distributed compression in a sensor network with partial cooperation among sensors. The widely known CEO problem, where each sensor has to compress its measurements locally in order to forward them over capacity limited links to a common receiver is extended by allowing sensors to mutually communicate. This extension comes along with modified statistical dependencies among involved random variables compared to the original CEO problem, such that well-known outer and inner bounds do not hold anymore. Three different inter-sensor communication protocols are investigated. The successive broadcast approach allows each sensor to exploit instantaneous side-information of all previously transmitting sensors. As this leads to dimensionality problems for larger networks, a sequential point-to-point communication scheme is considered forwarding instantaneous side-information to only one successor. Thirdly, a two-phase transmission protocol separates the information exchange between sensors and the communication with the common receiver. Inspired by algorithmic solutions for the original CEO problem, the sensors are optimized in a greedy manner. It turns out that partial communication among sensors improves the performance significantly. In particular, the two-phase transmission can reach the performance of a fully cooperative CEO scenario, where each sensor has access to all measurements and the knowledge about all channel conditions. Moreover, exchanging instantaneous side-information increases the robustness against bad Wyner–Ziv coding strategies, which can lead to significant performance losses in the original CEO problem.
Article
We consider systems in which the transmitter conveys messages to the receiver through a capacity-limited relay station. The channel between the transmitter and the relay station is assumed to be a frequency selective additive Gaussian noise channel. It is assumed that the transmitter can shape the spectrum and adapt the coding technique so as to optimize performance. The relay operation is oblivious (nomadic transmitters), that is, the specific codebooks used are unknown. We find the reliable information rate that can be achieved with Gaussian signaling in this setting, and to that end, employ Gaussian bottleneck results combined with Shannon’s incremental frequency approach. We also prove that, unlike classical water-pouring, the allocated spectrum (power and bit-rate) of the optimal solution could frequently be discontinuous. These results can also be applied to a MIMO transmission scheme. We also investigate the case of an entropy limited relay. We show that the optimal relay function is always deterministic, present lower and upper bounds on the optimal performance (in terms of mutual information), and derive an analytical approximation.
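For contrast with the discontinuous allocations found in that work, the classical water-pouring baseline over parallel Gaussian sub-channels can be sketched as follows (a standard bisection on the water level; all names are illustrative):

```python
import numpy as np

def water_filling(noise_levels, total_power, tol=1e-9):
    """Classical water-pouring: allocate P_i = max(mu - N_i, 0) with
    sum(P_i) = total_power, solving for the water level mu by bisection."""
    noise = np.asarray(noise_levels, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - noise, 0.0).sum() > total_power:
            hi = mu                              # water level too high
        else:
            lo = mu                              # water level too low
    return np.maximum(0.5 * (lo + hi) - noise, 0.0)

# Three sub-channels with noise levels 1, 2, 4 and total power 3:
# the water level settles at mu = 3, so the worst sub-channel gets nothing.
alloc = water_filling([1.0, 2.0, 4.0], 3.0)
```

Note how the classical solution is continuous in the budget: power drains smoothly out of bad sub-channels, whereas the oblivious-relay optimum in the paper can jump.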
Chapter
Front matter: half-title page, series page, title page, copyright page, dedication, preface, acknowledgements, contents, list of figures, index.
Article
A single memoryless Gaussian source is observed by many terminals, subject to independent Gaussian observation noises. The terminals are linked to a fusion center via a standard Gaussian multiple-access channel. The fusion center needs to recover the underlying Gaussian source with respect to mean-squared error. In this correspondence, a theorem of Witsenhausen is shown to imply that an optimal communication strategy is uncoded transmission, i.e., each terminal's channel input is merely a scaled version of its noisy observation.
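The uncoded strategy described in this abstract is straightforward to simulate. A minimal sketch under assumed parameters (unit-variance source, observation noise variance 0.5, per-terminal power 1, channel noise variance 0.1; all names are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

def uncoded_mse(L=10, obs_var=0.5, ch_var=0.1, power=1.0, trials=20000):
    """Each terminal sends a scaled version of its noisy observation over a
    Gaussian MAC; the fusion center applies the linear MMSE estimate."""
    a = np.sqrt(power / (1.0 + obs_var))        # scaling meets the power constraint
    x = rng.normal(0.0, 1.0, trials)            # unit-variance Gaussian source
    y = x[:, None] + rng.normal(0.0, np.sqrt(obs_var), (trials, L))
    s = a * y.sum(axis=1) + rng.normal(0.0, np.sqrt(ch_var), trials)
    # LMMSE coefficient g = E[x s] / E[s^2]
    g = (a * L) / ((a * L) ** 2 + a ** 2 * L * obs_var + ch_var)
    return float(np.mean((x - g * s) ** 2))

mse = uncoded_mse()
```

With these parameters the analytic LMMSE error is $1 - (aL)^2 / \mathbb{E}[s^2] \approx 0.049$, and the Monte-Carlo estimate lands close to it.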
Chapter
This chapter introduces the concept of differential entropy, which is the entropy of a continuous random variable. Differential entropy is also related to the shortest description length, and is similar in many ways to the entropy of a discrete random variable. But there are some important differences, and there is need for some care in using the concept.
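For a Gaussian, the differential entropy has the closed form $h(X) = \tfrac{1}{2}\log_2(2\pi e\sigma^2)$ bits, which a Monte-Carlo estimate of $-\mathbb{E}[\log_2 p(X)]$ reproduces. The snippet also illustrates one of the differences the chapter warns about: unlike discrete entropy, differential entropy can be negative (function names are invented for the example).

```python
import numpy as np

def gaussian_diff_entropy(sigma2):
    """h(X) = 0.5 * log2(2*pi*e*sigma^2) bits for X ~ N(mu, sigma^2)."""
    return 0.5 * np.log2(2 * np.pi * np.e * sigma2)

# Monte-Carlo check: h = -E[log2 p(X)] under the true standard-normal density.
rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 200000)
p = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
h_mc = float(np.mean(-np.log2(p)))
```

For $\sigma^2 = 1$ the closed form gives about 2.05 bits, while a narrow Gaussian (say $\sigma^2 = 0.01$) already has negative differential entropy.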
Article
A firm's CEO employs a team of L agents who observe independently corrupted versions of a data sequence $\{X(t)\}_{t=1}^{\infty}$. Let R be the total data rate at which the agents may communicate information about their observations to the CEO. The agents are not allowed to convene. Berger, Zhang and Viswanathan (see ibid., vol.42, no.5, p.887-902, 1996) determined the asymptotic behavior of the minimal error frequency in the limit as L and R tend to infinity for the case in which the source and observations are discrete and memoryless. We consider the same multiterminal source coding problem when $\{X(t)\}_{t=1}^{\infty}$ is an independent and identically distributed (i.i.d.) Gaussian process corrupted by independent Gaussian noise. We study, under quadratic distortion, the rate-distortion tradeoff in the limit as L and R tend to infinity. As in the discrete case, there is a significant loss between the cases when the agents are allowed to convene and when they are not. As L→∞, if the agents may pool their data before communicating with the CEO, the distortion decays exponentially with the total rate R; this corresponds to the distortion-rate function for an i.i.d. Gaussian source. However, for the case in which they are not permitted to convene, we establish that the distortion decays asymptotically only as $R^{-1}$.