January 2006 · 231 Reads · 13,002 Citations
January 2006 · 209 Reads · 3,544 Citations
April 2005 · 9 Reads · 2 Citations
Half Title · Title · Copyright · Contents · Preface to the Second Edition · Preface to the First Edition · Acknowledgments for the Second Edition · Acknowledgments for the First Edition
April 2005 · 35 Reads · 18 Citations
Definitions · AEP for Continuous Random Variables · Relation of Differential Entropy to Discrete Entropy · Joint and Conditional Differential Entropy · Relative Entropy and Mutual Information · Properties of Differential Entropy, Relative Entropy, and Mutual Information · Summary · Problems · Historical Notes
April 2005 · 150 Reads · 3 Citations
The Stock Market: Some Definitions · Kuhn–Tucker Characterization of the Log-Optimal Portfolio · Asymptotic Optimality of the Log-Optimal Portfolio · Side Information and the Growth Rate · Investment in Stationary Markets · Competitive Optimality of the Log-Optimal Portfolio · Universal Portfolios · Shannon–McMillan–Breiman Theorem (General AEP) · Summary · Problems · Historical Notes
April 2005 · 651 Reads · 314 Citations
Entropy · Joint Entropy and Conditional Entropy · Relative Entropy and Mutual Information · Relationship Between Entropy and Mutual Information · Chain Rules for Entropy, Relative Entropy, and Mutual Information · Jensen's Inequality and Its Consequences · Log Sum Inequality and Its Applications · Data-Processing Inequality · Sufficient Statistics · Fano's Inequality · Summary · Problems · Historical Notes
April 2005 · 44 Reads · 5 Citations
Basic Inequalities of Information Theory · Differential Entropy · Bounds on Entropy and Relative Entropy · Inequalities for Types · Combinatorial Bounds on Entropy · Entropy Rates of Subsets · Entropy and Fisher Information · Entropy Power Inequality and Brunn–Minkowski Inequality · Inequalities for Determinants · Inequalities for Ratios of Determinants · Summary · Problems · Historical Notes
April 2005 · 54 Reads · 81 Citations
Method of Types · Law of Large Numbers · Universal Source Coding · Large Deviation Theory · Examples of Sanov's Theorem · Conditional Limit Theorem · Hypothesis Testing · Chernoff–Stein Lemma · Chernoff Information · Fisher Information and the Cramér–Rao Inequality · Summary · Problems · Historical Notes
April 2005 · 13 Reads · 16 Citations
April 2005 · 71 Reads · 2 Citations
The Horse Race · Gambling and Side Information · Dependent Horse Races and Entropy Rate · The Entropy of English · Data Compression and Gambling · Gambling Estimate of the Entropy of English · Summary · Problems · Historical Notes
... Following the organization of gene clusters, hypergeometric tests were performed for each cluster to test for enrichment of DE genes. An information-theoretic approach [39] was adopted to infer gene expression networks within select clusters using the minet software package for R [40]. Mutual information (MI) measures the information content that two variables share: a numerical value ranging from 0 to 1 that reflects, intuitively, how well knowing one variable predicts the variability of the other. ...
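As a rough illustration of the idea (not the minet implementation), the sketch below estimates mutual information between two discretized expression profiles and normalizes it to [0, 1]; the equal-width binning and the toy gene vectors are assumptions.

```python
import numpy as np

def normalized_mutual_information(x, y, bins=10):
    """Histogram-based MI between two continuous profiles, scaled to [0, 1]."""
    # Joint and marginal histograms (equal-width binning is an assumption here).
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    # I(X;Y) = sum p(x,y) log p(x,y) / (p(x) p(y)), skipping empty cells.
    nonzero = pxy > 0
    mi = np.sum(pxy[nonzero] * np.log(pxy[nonzero] / np.outer(px, py)[nonzero]))

    # Normalize by the smaller marginal entropy so the score lies in [0, 1].
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return mi / min(hx, hy) if min(hx, hy) > 0 else 0.0

rng = np.random.default_rng(0)
gene_a = rng.normal(size=200)
gene_b = gene_a + 0.5 * rng.normal(size=200)   # correlated partner (illustrative)
print(normalized_mutual_information(gene_a, gene_b))
```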
April 2005
... The goal of authentication is now to decide whether this empirical probability is more likely to be P_b(ω) or Q_b(ω) and provide bounds on this decision [13]. ...
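A minimal sketch of such a decision rule, assuming discrete distributions and illustrative values for P_b and Q_b: the empirical type of the observed sequence is compared to each hypothesis via Kullback–Leibler divergence, the quantity that governs the error exponents in the Chernoff–Stein setting.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions (nats)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical symbol distributions under the two hypotheses.
P_b = np.array([0.7, 0.2, 0.1])
Q_b = np.array([0.3, 0.4, 0.3])

# Empirical type computed from an observed sequence over the alphabet {0, 1, 2}.
obs = np.array([0, 0, 1, 0, 2, 0, 1, 0, 0, 1])
emp = np.bincount(obs, minlength=3) / len(obs)

# Decide for the hypothesis whose distribution is closer in KL divergence.
decision = "P_b" if kl(emp, P_b) < kl(emp, Q_b) else "Q_b"
print(decision, kl(emp, P_b), kl(emp, Q_b))
```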
April 2005
... Finally, we estimated the temporal complexity of the brain for the whole spectrum of frequencies (1–45 Hz) by means of Lempel–Ziv complexity (LZC) (Lempel and Ziv, 1976). This measure is based on Kolmogorov complexity and quantifies the minimal "information" contained in a sequence (Cover and Thomas, 2006; Schartner et al., 2015). To analyze continuous sequences with LZC, such as the EEG signal, the signal first needs to be discretized; in this work we employed binarization by the mean value (Zozor et al., 2005). ...
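The following is a simplified phrase-counting variant of Lempel–Ziv complexity applied to a mean-binarized toy signal; it sketches the general procedure and is not the exact implementation used in the cited work.

```python
import numpy as np

def lempel_ziv_complexity(binary_sequence):
    """Count the number of distinct phrases in a simple LZ-style parsing."""
    s = "".join(map(str, binary_sequence))
    phrases, i = set(), 0
    while i < len(s):
        j = i + 1
        # Extend the current phrase until it has not been seen before.
        while j <= len(s) and s[i:j] in phrases:
            j += 1
        phrases.add(s[i:j])
        i = j
    return len(phrases)

# Binarization by mean value, as described above, applied to a toy "signal".
rng = np.random.default_rng(0)
signal = rng.normal(size=1000)
binary = (signal > signal.mean()).astype(int)
print(lempel_ziv_complexity(binary))
```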
April 2005
... Our general problem here is universal coding of integers, meaning that the encoder and decoder have no knowledge of the source statistics or of the probability distribution on the integers that form the source alphabet, and therefore cannot make use of them (Davisson, 1973; Andreasen, 2001). The information source is assumed to be discrete, stationary, and memoryless. ...
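As one classic example of a universal integer code (the snippet does not specify which code is used), the sketch below implements Elias gamma coding, which requires no knowledge of the integer distribution.

```python
def elias_gamma_encode(n: int) -> str:
    """Elias gamma code: unary length prefix followed by the binary representation."""
    if n < 1:
        raise ValueError("Elias gamma is defined for positive integers only.")
    binary = bin(n)[2:]                       # binary representation without '0b'
    return "0" * (len(binary) - 1) + binary   # len(binary)-1 zeros, then the bits

def elias_gamma_decode(bits: str) -> int:
    """Inverse of the encoder above."""
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    return int(bits[zeros:2 * zeros + 1], 2)

for n in (1, 2, 5, 13):
    code = elias_gamma_encode(n)
    print(n, code, elias_gamma_decode(code))
```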
April 2005
... The theoretical fundamentals of human mobility and the background for studying the regularity and predictability of human mobility are introduced below. The notations, definitions, and formulas follow those presented in [54] and [2], where the entropy rate has been used to quantify the extent to which an individual's travel patterns are regular and predictable. ...
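A minimal sketch of an LZ-style entropy-rate estimator of the kind used in this literature, applied to a hypothetical location trace; the trace and helper names are illustrative, and the exact estimator in [54] and [2] may differ.

```python
import math

def lz_entropy_rate(symbols):
    """LZ-based estimate of the entropy rate (bits per symbol) of a discrete sequence."""
    s = list(symbols)
    n = len(s)
    lambdas = []
    for i in range(n):
        # Length of the shortest substring starting at i that has not appeared in s[:i].
        k = 1
        while i + k <= n and _appears(s[:i], s[i:i + k]):
            k += 1
        lambdas.append(k)
    return (n / sum(lambdas)) * math.log2(n)

def _appears(history, pattern):
    """True if `pattern` occurs as a contiguous block inside `history`."""
    m = len(pattern)
    return any(history[j:j + m] == pattern for j in range(len(history) - m + 1))

# Toy daily location trace: home (H), work (W), gym (G).
trace = list("HWHWHWHGHWHWHWHGHWHW")
print(round(lz_entropy_rate(trace), 3))
```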
October 2001
... Rate-distortion theory [25,26,27] aids in a deeper understanding of the trade-off between the regularization and reconstruction losses. The general rate-distortion problem is formulated before making these connections. ...
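As a concrete instance of that trade-off, the sketch below evaluates the classical rate-distortion function of a Gaussian source, R(D) = 1/2 log2(σ²/D): tolerating less distortion requires a higher rate. The variance and distortion values are illustrative.

```python
import math

def gaussian_rate_distortion(variance, distortion):
    """R(D) = 1/2 * log2(sigma^2 / D) for 0 < D < sigma^2, else 0 (bits per sample)."""
    if distortion >= variance:
        return 0.0
    return 0.5 * math.log2(variance / distortion)

# Lower allowed distortion D demands a higher rate R(D).
for D in (0.9, 0.5, 0.1, 0.01):
    print(D, round(gaussian_rate_distortion(1.0, D), 3))
```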
October 2001
... Evidently, as shown in Eq. (3), the system model of uplink MDMA is similar in form to that of uplink power-domain NOMA (PD-NOMA). Hence, it can easily be proved that the uplink MDMA system achieves the limit of the achievable capacity region of the MA channel (Cover and Thomas, 2001), as the PD-NOMA scheme does (Wu et al., 2018). ...
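A small numerical sketch of the two-user Gaussian multiple-access capacity region, with assumed received powers and noise power; the corner points correspond to the two successive-interference-cancellation decoding orders.

```python
import math

def c(snr):
    """Shannon capacity of a Gaussian channel, in bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

# Assumed per-user received powers and noise power (illustrative values).
P1, P2, N = 4.0, 2.0, 1.0

R1_max = c(P1 / N)            # user 1 decoded without interference
R2_max = c(P2 / N)            # user 2 decoded without interference
R_sum  = c((P1 + P2) / N)     # sum-rate bound of the MA channel

# Corner points of the pentagon-shaped capacity region.
corner_a = (c(P1 / (N + P2)), R2_max)   # user 2 decoded last
corner_b = (R1_max, c(P2 / (N + P1)))   # user 1 decoded last
print(R1_max, R2_max, R_sum, corner_a, corner_b)
```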
October 2001
... To evaluate the performance of the clustering algorithm, the Adjusted Rand Index (ARI) (Halkidi, Batistakis, and Vazirgiannis 2002) and Normalized Mutual Information (NMI) (Cover and Thomas 1991) are used. ARI measures the similarity between two clustering results, ranging from -0.5 to 1, with 1 indicating perfect agreement. ...
Reference: Spectral Bridges
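A minimal usage sketch with scikit-learn, whose metrics implement ARI and NMI; the label vectors below are hypothetical.

```python
from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

# Hypothetical ground-truth labels and labels produced by a clustering algorithm.
labels_true = [0, 0, 0, 1, 1, 1, 2, 2, 2]
labels_pred = [0, 0, 1, 1, 1, 1, 2, 2, 0]

print("ARI:", adjusted_rand_score(labels_true, labels_pred))
print("NMI:", normalized_mutual_info_score(labels_true, labels_pred))
```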
October 2001
... This article presents EnDED, which implements four approaches, and their combination, to indicate environmentally driven (indirect) associations in microbial networks. The four methods are sign pattern (SP) [23], overlap (developed here), interaction information [23,37], and the data processing inequality [27,38]. SP requires an association score that represents co-occurrence when it is positive, and mutual exclusion when it is negative. ...
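A rough sketch of estimating interaction information for three discrete variables from co-observed samples; it uses the identity II(X;Y;Z) = I(X;Y|Z) − I(X;Y) (sign conventions vary) and is not the EnDED implementation. The presence/absence profiles are simulated.

```python
import numpy as np
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given joint counts."""
    p = np.array(list(counts.values()), float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def interaction_information(x, y, z):
    """II(X;Y;Z) = I(X;Y|Z) - I(X;Y), estimated from co-observed discrete samples."""
    Hx, Hy, Hz = (entropy(Counter(v)) for v in (x, y, z))
    Hxy = entropy(Counter(zip(x, y)))
    Hxz = entropy(Counter(zip(x, z)))
    Hyz = entropy(Counter(zip(y, z)))
    Hxyz = entropy(Counter(zip(x, y, z)))
    return -(Hx + Hy + Hz) + (Hxy + Hxz + Hyz) - Hxyz

# Hypothetical presence/absence profiles of two taxa (x, y) driven by an
# environmental factor (z): most of their shared information is explained by z.
rng = np.random.default_rng(1)
z = rng.integers(0, 2, 500)
x = (z + (rng.random(500) < 0.1)) % 2
y = (z + (rng.random(500) < 0.1)) % 2
print(round(interaction_information(x.tolist(), y.tolist(), z.tolist()), 3))
```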
October 2001
... This PDF corresponds to the isotropic wave of maximum entropy for a fixed value of the intensity [17]. Instead of P_0^Γ, let us consider a reference PDF P_0, which is still isotropic but minimizes the Kullback relative entropy K(P_a || P_0). In other words, let us consider the following degree of anisotropy: ...
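A rough numerical sketch of such a divergence-based anisotropy measure, assuming discretized angular PDFs; the beam pattern and uniform reference below are illustrative and not the cited paper's exact construction.

```python
import numpy as np

def kullback_relative_entropy(p, q):
    """K(p || q) = sum_i p_i log(p_i / q_i), for discretized PDFs on the same grid."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical angular intensity distribution over direction bins, compared with
# the isotropic (uniform) reference P_0; a larger divergence means a more anisotropic wave.
angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
P_a = 1.0 + 0.8 * np.cos(angles)   # directional pattern (illustrative)
P_0 = np.ones_like(angles)         # isotropic reference
print(kullback_relative_entropy(P_a, P_0))
```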
October 2001