ABSTRACT: This paper supplements an earlier paper by the author. We obtain an explicit formula which, in a special case, allows us to compute the maximum of the mutual information of several random variables from the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We also establish two new inequalities for the binary entropy function related to the problem considered here.
Problems of Information Transmission 04/2012; 46(2):122-126.
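To make the quantities named in the abstract concrete, here is a minimal sketch computing, for a discrete joint distribution, the mutual information, the variational distance between the joint distribution and the product of its marginals, and the binary entropy function. The function names are ours, and the variational distance is taken here as the unnormalized sum of absolute differences; the paper's definition may differ by a factor of 1/2, and the paper's explicit formula relating the two quantities is not reproduced here.

```python
import math


def binary_entropy(p):
    """Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def _marginals(joint):
    """Marginal pmfs of a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py


def mutual_information(joint):
    """I(X;Y) in bits: sum over (x, y) of P(x,y) * log2(P(x,y) / (P(x)P(y)))."""
    px, py = _marginals(joint)
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)


def variational_distance(joint):
    """Sum of |P(x,y) - P(x)P(y)| between the joint distribution and the
    product of its marginals (some texts include an extra factor of 1/2)."""
    px, py = _marginals(joint)
    return sum(abs(p - px[x] * py[y]) for (x, y), p in joint.items())


# Example: two correlated fair bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(binary_entropy(0.5))          # 1.0
print(variational_distance(joint))  # 0.6
print(mutual_information(joint))    # positive, since the bits are dependent
```

Both quantities vanish exactly when the variables are independent, which is what makes bounding one by the other meaningful.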