Conference Paper

Index Coding with Side Information

Department of Electrical Engineering, Technion - Israel Institute of Technology, Haifa
DOI: 10.1109/FOCS.2006.42 Conference: 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS '06), 2006
Source: IEEE Xplore


Motivated by a problem of transmitting data over broadcast channels (Birk and Kol, INFOCOM 1998), we study the following coding problem: a sender communicates with n receivers R1, ..., Rn. He holds an input x ∈ {0,1}^n and wishes to broadcast a single message so that each receiver Ri can recover the bit xi. Each Ri has prior side information about x, induced by a directed graph G on n nodes; Ri knows the bits of x in the positions {j | (i, j) is an edge of G}. We call encoding schemes that achieve this goal INDEX codes for {0,1}^n with side information graph G. In this paper we identify a measure on graphs, the minrank, which we conjecture to exactly characterize the minimum length of INDEX codes. We resolve the conjecture for certain natural classes of graphs. For arbitrary graphs, we show that the minrank bound is tight for both linear codes and certain classes of non-linear codes. For the general problem, we obtain a (weaker) lower bound: the length of an INDEX code for any graph G is at least the size of the maximum acyclic induced subgraph of G.
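To make the setting concrete, the following is a minimal sketch (not taken from the paper) of a scalar linear index code over F2 for one small instance: the directed 3-cycle in which R1 knows x2, R2 knows x3, and R3 knows x1. The matrix with rows (1,1,0), (0,1,1), (1,0,1) fits this graph and has rank 2 over F2, so two broadcast bits suffice instead of three; the instance and all function names are illustrative.

```python
# Illustrative sketch (not from the paper): a linear index code over F2 for the
# directed 3-cycle side information graph, where Ri demands x_i and knows the
# next bit around the cycle.  The fitting matrix
#     A = [[1,1,0],
#          [0,1,1],
#          [1,0,1]]
# has rank 2 over F2, so 2 broadcast bits suffice for 3 receivers (minrank = 2).

def encode(x):
    """Broadcast two XOR combinations of the three input bits x = (x1, x2, x3)."""
    y1 = x[0] ^ x[1]          # x1 + x2 over F2
    y2 = x[1] ^ x[2]          # x2 + x3 over F2
    return (y1, y2)

def decode(i, y, side_info):
    """Receiver Ri recovers x_i from the broadcast y and its single known bit."""
    y1, y2 = y
    if i == 1:                # R1 knows x2:  x1 = y1 + x2
        return y1 ^ side_info
    if i == 2:                # R2 knows x3:  x2 = y2 + x3
        return y2 ^ side_info
    if i == 3:                # R3 knows x1:  x3 = y1 + y2 + x1
        return y1 ^ y2 ^ side_info
    raise ValueError("unknown receiver")

# Sanity check over all 8 inputs: every receiver recovers its bit from 2 broadcast bits.
for x in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    y = encode(x)
    assert decode(1, y, x[1]) == x[0]
    assert decode(2, y, x[2]) == x[1]
    assert decode(3, y, x[0]) == x[2]
```

In this instance the number of broadcast bits equals the rank of a matrix fitting the side information graph, which is the minrank quantity studied in the paper.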

  • Source
    • "A linear index code is one whose encoding function is linear and it is linearly decodable if all the decoding functions are linear. It was shown in [3] that for the class of index coding problems over F 2 which can be represented using side information graphs, which were labeled later in [4] as single unicast index coding problems, the length of optimal linear index code is equal to the minrank over F 2 of the corresponding side information graph. This was extended in [6] to general index coding problems, over F q , using minrank over F q of their corresponding side information hypergraphs. "
    ABSTRACT: This paper discusses the noisy index coding problem over a Gaussian broadcast channel. We propose a technique for mapping the index coded bits to M-QAM symbols such that receivers whose side information satisfies certain conditions get a coding gain, which we call the QAM side information coding gain. We compare this with the PSK side information coding gain, which was discussed in "Index Coded PSK Modulation," arXiv:1356200 [cs.IT], 19 September 2015.
  • Source
    • "The problem of index coding with side information was introduced by Birk and Kol [2]. Bar-Yossef et al. [3] studied a type of index coding problem in which each receiver demands only one single message and the number of receivers equals number of messages. Ong and Ho [5] classify the binary index coding problem depending on the demands and the side information possessed by the receivers. "
    ABSTRACT: This paper deals with scalar linear index codes for canonical multiple unicast index coding problems where there is a source with K messages and there are K receivers, each wanting a unique message and having symmetric (with respect to the receiver index) antidotes (side information). Optimal scalar linear index codes for several such instances of this class of problems have been reported in \cite{MRRarXiv}. These codes can be viewed as special cases of the symmetric unicast index coding problems discussed in \cite{MCJ}. In this paper a lifting construction is given which constructs a sequence of multiple unicast index coding problems starting from a given multiple unicast index coding problem. It is also shown that if an optimal scalar linear index code is known for the given starting problem, then optimal scalar linear index codes can be obtained from the known code for all the problems arising from the proposed lifting construction. For several of the known classes of multiple unicast problems, our construction is used to obtain sequences of multiple unicast problems with optimal scalar linear index codes.
  • Source
    • "codewords, x (n) k , generated according to an independent normal distribution X k ∼ N (0, α k P ), where α k ≥0 and k α k = 1 to satisfy the transmission power constraint. Multiplexing coding [16] and index coding [17] are employed to construct the subcodebooks. In multiplexing coding, two or more messages are first bijectively mapped to a single message, and then, the codewords are generated for this message. "
    ABSTRACT: This paper investigates the capacity region of the three-receiver AWGN broadcast channel where the receivers (i) have private-message requests, and (ii) may know some of the messages requested by other receivers as side information. We first classify all 64 possible side information configurations into eight groups, each consisting of eight members. We next construct transmission schemes, and derive new inner and outer bounds for the groups. This establishes the capacity region for 52 out of the 64 possible side information configurations. For six groups (i.e., groups 1, 2, 3, 5, 6, and 8 in our terminology), we establish the capacity region for all their members, and show that it tightens both the best known inner and outer bounds. For group 4, our inner and outer bounds tighten the best known inner bound and/or outer bound for all the group members. Moreover, our bounds coincide at certain regions, which can be characterized by two thresholds. For group 7, our inner and outer bounds coincide for four members, thereby establishing the capacity region. For the remaining four members, our bounds tighten both the best known inner and outer bounds.
    IEEE Transactions on Information Theory, 07/2015; DOI: 10.1109/TIT.2015.2463277
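One of the citation contexts above describes multiplexing coding as bijectively mapping two or more messages to a single message before codeword generation. The snippet below is an illustrative sketch of that bijection only (it is not taken from the cited paper, and the function names are hypothetical); the point is that a receiver which already knows one of the component messages can strip it off after decoding the combined index.

```python
# Illustrative sketch of multiplexing coding's bijective message map (not from
# the cited paper): messages w1 in {0,...,M1-1} and w2 in {0,...,M2-1} are
# combined into a single index in {0,...,M1*M2-1}; one codebook is generated
# for the combined index, and a receiver with w2 as side information recovers
# w1 (and vice versa) once the combined index is decoded.

def multiplex(w1, w2, M2):
    """Bijectively map the message pair (w1, w2) to a single index."""
    return w1 * M2 + w2

def demultiplex(w, M2):
    """Invert the map: recover (w1, w2) from the combined index."""
    return divmod(w, M2)

# Example with M1 = 4, M2 = 8: the combined message set has 32 elements.
w = multiplex(3, 5, M2=8)            # combined index 29
assert demultiplex(w, M2=8) == (3, 5)
```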