Conference Paper

Fundamental Limits of Almost Lossless Analog Compression

Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
DOI: 10.1109/ISIT.2009.5205734 · Conference: 2009 IEEE International Symposium on Information Theory (ISIT 2009)
Source: IEEE Xplore

ABSTRACT

In Shannon theory, lossless source coding deals with the optimal compression of discrete sources. Compressed sensing is a lossless coding strategy for analog sources based on multiplication by real-valued matrices. In this paper we study almost lossless analog compression of analog memoryless sources in an information-theoretic framework, in which the compressor is not constrained to linear transformations but must satisfy various regularity conditions, such as Lipschitz continuity. The fundamental limit is shown to be the information dimension proposed by Rényi in 1959.
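For reference (the abstract names it but does not define it), Rényi's information dimension of a real-valued random variable X is the entropy of its uniformly quantized version, normalized by the log of the quantization resolution:

```latex
% Rényi information dimension: quantize X to resolution 1/m,
% then normalize the quantizer's entropy by log m.
d(X) = \lim_{m\to\infty} \frac{H\!\left(\lfloor mX \rfloor\right)}{\log m}
```

When X is a mixture of a discrete part and an absolutely continuous part carrying weight ρ, the limit exists and d(X) = ρ; this is the quantity the paper identifies as the fundamental compression rate.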

Full-text preview available from: princeton.edu
  • Source
    • "It is desirable to have this parameter as small as possible. Indeed, in [4] the authors proved that r_o = 1 is achievable in the asymptotic case (n → ∞). This means that the highest density ratio that an algorithm can possibly handle is γ* = r_c. "
    ABSTRACT: In this paper, we present a new approach for the analysis of iterative node-based verification-based (NB-VB) recovery algorithms in the context of compressive sensing. These algorithms are particularly interesting due to their low complexity (linear in the signal dimension n). The asymptotic analysis predicts the fraction of unverified signal elements at each iteration ℓ in the asymptotic regime where n → ∞. The analysis is similar in nature to the well-known density evolution technique commonly used to analyze iterative decoding algorithms. To perform the analysis, a message-passing interpretation of NB-VB algorithms is provided. This interpretation lacks the extrinsic nature of standard message-passing algorithms to which density evolution is usually applied. This requires a number of non-trivial modifications in the analysis. The analysis tracks the average performance of the recovery algorithms over the ensembles of input signals and sensing matrices as a function of ℓ. Concentration results are devised to demonstrate that the performance of the recovery algorithms applied to any choice of the input signal over any realization of the sensing matrix follows the deterministic results of the analysis closely. Simulation results are also provided which demonstrate that the proposed asymptotic analysis matches the performance of recovery algorithms for large but finite values of n. Compared to the existing technique for the analysis of NB-VB algorithms, which is based on numerically solving a large system of coupled differential equations, the proposed method is much simpler and more accurate.
    Full-text · Conference Paper · Sep 2011
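As a rough illustration of the verification rules that such node-based VB recovery algorithms build on, here is a hedged sketch (not the paper's algorithm: the function name, the dense-matrix loop, and the restriction to the two rules shown are our simplification; the zero-check rule assumes continuous-valued nonzeros, so contributions cancel with probability zero):

```python
import numpy as np

def nb_vb_recover(A, y, max_iters=100):
    """Toy verification-based recovery of a sparse x from y = A @ x.

    Two classic verification rules are applied until no progress is made:
      - zero check: a check whose residual is 0 verifies all of its
        unverified neighbors as 0 (valid a.s. for continuous nonzeros);
      - degree-1 check: a check with a single unverified neighbor
        determines that neighbor's value from its residual.
    """
    n = A.shape[1]
    x_hat = np.zeros(n)
    verified = np.zeros(n, dtype=bool)
    for _ in range(max_iters):
        progress = False
        for c in range(A.shape[0]):
            nbrs = np.flatnonzero(A[c])
            unv = nbrs[~verified[nbrs]]
            if unv.size == 0:
                continue
            r_c = y[c] - A[c] @ x_hat  # fresh residual for this check
            if np.isclose(r_c, 0.0):
                verified[unv] = True   # zero check: all neighbors are 0
                progress = True
            elif unv.size == 1:
                j = unv[0]             # degree-1 check: value is forced
                x_hat[j] = r_c / A[c, j]
                verified[j] = True
                progress = True
        if not progress:
            break
    return x_hat, verified
```

On small toy instances these two rules alone already recover the signal; the asymptotic analysis in the cited paper tracks precisely the fraction of still-unverified elements under rules of this kind as n → ∞.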
  • Source
    • "This work was supported in part by the National Science Foundation under Grants CCF-0635154 and CCF-0728445. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Seoul, Korea, July 2009 [55]. "
    ABSTRACT: In Shannon theory, lossless source coding deals with the optimal compression of discrete sources. Compressed sensing is a lossless coding strategy for analog sources by means of multiplication by real-valued matrices. In this paper we study almost lossless analog compression for analog memoryless sources in an information-theoretic framework, in which the compressor or decompressor is constrained by various regularity conditions, in particular linearity of the compressor and Lipschitz continuity of the decompressor. The fundamental limit is shown to be the information dimension proposed by Rényi in 1959.
    Preview · Article · Sep 2010 · IEEE Transactions on Information Theory
  • Source
    • "In fact, among the results presented in Table I, the application of the Genie and LM to (3, 4) graphs results in the lowest oversampling ratio (r_o = d_v/(α d_c)) of ≈ 1.16 and ≈ 2.51, respectively. Note that the success threshold of the Genie over regular graphs is far from the optimal achievable success threshold d_v/d_c proved in [4]. "
    ABSTRACT: In this paper, we propose a general framework for the asymptotic analysis of node-based verification-based algorithms. In our analysis we let the signal length $n$ tend to infinity. We also let the number of non-zero elements of the signal $k$ scale linearly with $n$. Using the proposed framework, we study the asymptotic behavior of the recovery algorithms over random sparse matrices (graphs) in the context of compressive sensing. Our analysis shows that there exists a success threshold on the density ratio $k/n$, before which the recovery algorithms are successful, and beyond which they fail. This threshold is a function of both the graph and the recovery algorithm. We also demonstrate that there is good agreement between the asymptotic behavior of recovery algorithms and finite-length simulations for moderately large values of $n$.
    Full-text · Article · Jan 2010
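The oversampling ratio r_o = d_v/(α d_c) quoted in the excerpts above follows from simple counting on a (d_v, d_c)-regular bipartite graph: m·d_c = n·d_v edges gives a measurement rate m/n = d_v/d_c, and dividing by the threshold density ratio α = k/n yields measurements spent per nonzero. A quick numeric check (the function name is ours):

```python
def oversampling_ratio(d_v, d_c, alpha):
    """r_o = d_v / (alpha * d_c): measurements per nonzero element
    when the density ratio k/n sits at the success threshold alpha."""
    return d_v / (alpha * d_c)

# At the optimal achievable threshold alpha = d_v/d_c, exactly one
# measurement is spent per nonzero, consistent with the asymptotic
# achievability of r_o = 1 quoted above.
print(oversampling_ratio(3, 4, 3 / 4))  # → 1.0
```

Conversely, the reported r_o ≈ 1.16 for the Genie on (3, 4) graphs corresponds to a success threshold of α = 3/(4 · 1.16) ≈ 0.647, below the optimal d_v/d_c = 0.75.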