Illustration of the lower bound on the effective identification rate $\bar{R}_{\text{eff}}$ provided in (14) for target identification rates $\kappa = 0, 0.1, 0.2$ and codeword length $n = 10^4$. The ISI rate $l$ that yields the maximum value for each value of $\kappa$ is marked by a yellow star and coincides with the optimal $l_{\max}$ provided in (15).

Source publication
Article
Various applications of molecular communications (MCs) feature an alarm-prompt behavior for which the prevalent Shannon capacity may not be the appropriate performance metric. The identification capacity has been motivated and established in the literature as an alternative measure for such systems. In this paper, we study deterministic K-identific...

Contexts in source publication

Context 1
... rate is independent of $n$, here, the effective identification rate $\bar{R}_{\text{eff}}$ for the optimal $l$ grows sub-linearly in $n$. Moreover, this sub-linear growth in $n$ is faster than in the typical scenario, where $T_S$ (and hence $L$) is fixed, $l = 0$, and the effective rate, i.e., $((1 - \kappa)\log n)/4$, increases only logarithmically in $n$. Fig. 5 shows the lower bound on the effective identification rate $\bar{R}_{\text{eff}}$ in (14) for target identification rates $\kappa = 0, 0.1, 0.2$ and codeword length $n = 10^4$. Note that $n$ should be large since our capacity results are valid asymptotically. As expected, each curve in Fig. 5 has a unique maximum at an ISI rate $l$ that coincides with $l_{\max}$ in ...
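As a quick numerical companion to the excerpt above, the sketch below (not from the paper; a base-2 logarithm is assumed) evaluates the reference effective rate $((1 - \kappa)\log n)/4$ for the memoryless case $l = 0$ at the values used in Fig. 5.

```python
import math

# Minimal sketch, assuming the logarithm is base 2 (the excerpt does not state the base).
# Evaluates the reference effective rate ((1 - kappa) * log2(n)) / 4 for l = 0
# at the codeword length and target identification rates used in Fig. 5.
n = 10**4
for kappa in (0.0, 0.1, 0.2):
    r_eff_ref = (1 - kappa) * math.log2(n) / 4
    print(f"kappa = {kappa:.1f}: reference effective rate ~ {r_eff_ref:.3f}")
```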

Citations

... This observation can be interpreted as follows: the channel noise can be exploited as an additional inherent resource in the communication setting for performing the $K$-identification task with a larger value of $K$. This is in contrast to previous results for DKI over the slow fading channel [51] or DI for Gaussian and Poisson channels [32,48,52], where the capacity bounds were shown to be independent of the input constraints or the channel parameters. We demonstrate that the proposed upper and lower bounds on the attainable rate pairs $(R, \kappa)$ are independent of $K$ for constant $K$, whereas they are functions of the goal identification rate $\kappa$ for growing goal message sets. ...
... While the radius of the small balls in the DI problem for the Gaussian channel with slow and fast fading [32] tends to zero as $n \to \infty$, here, similar to the DKI problem for the slow fading channel [51], the radius grows with the codeword length $n$ in the asymptotic regime. In general, the derivation of the lower bound for the BSC is more involved than that for the Gaussian [32] and Poisson channels with and without memory [48,52], and entails new analysis and inequalities. Here, the error analysis in the achievability proof requires several combinatorial arguments and bounds on the tail of the cumulative distribution function (CDF) of the binomial distribution. ...
... The DKI problem was recently investigated in [52] for a DTPC with ISI, where the number of ISI taps is assumed to scale as $L(n, l) = 2^{l \log n}$. In contrast to the findings in [52], where the attainable rate region of rate triples $(\kappa, l, R)$ for the Poisson channel with memory was derived, here we study the DKI problem for a memoryless BSC, i.e., $L = 1$, and establish the attainable rate region of rate pairs $(\kappa, R)$. ...
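The excerpts above point to tail bounds on the binomial CDF as a key tool in the achievability analysis. The sketch below is purely illustrative and not the cited paper's derivation: it compares the exact binomial upper tail (via scipy) with a generic Hoeffding bound; the block length, crossover probability, and deviation are assumed values.

```python
from math import exp
from scipy.stats import binom

# Illustrative sketch only: a generic tail bound of the kind referenced in the
# error analysis, not the specific inequalities used in the cited paper.
n, p = 10_000, 0.1   # assumed block length and BSC crossover probability
t = 0.02             # assumed deviation from the mean fraction p

# Exact upper tail P[X >= n(p + t)] for X ~ Binomial(n, p); sf(k) = P[X > k].
exact_tail = binom.sf(n * (p + t) - 1, n, p)

# Hoeffding bound: P[X >= n(p + t)] <= exp(-2 n t^2).
hoeffding_bound = exp(-2 * n * t**2)

print(f"exact binomial tail : {exact_tail:.3e}")
print(f"Hoeffding bound     : {hoeffding_bound:.3e}")
```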
Article
Numerous applications of the Internet of Things (IoT) feature an event recognition behavior where the established Shannon capacity may not be the appropriate central performance measure. Instead, the identification capacity has been considered and developed in the literature as an alternative metric for such systems. In this paper, we develop deterministic K-identification (DKI) for the binary symmetric channel (BSC) with and without a Hamming weight constraint imposed on the codewords. This channel may be of use for IoT in the context of smart system technologies, where sophisticated communication models can be reduced to a BSC in order to study basic information-theoretic properties. We derive inner and outer bounds on the DKI capacity of the BSC when the size of the goal message set $K$ may grow with the codeword length $n$. As a major observation, we find that, for deterministic encoding, if $K$ grows exponentially in $n$, i.e., $K = 2^{n\kappa}$, where $\kappa$ is the identification goal rate, then the number of messages that can be accurately identified also grows exponentially in $n$, i.e., as $2^{nR}$, where $R$ is the DKI coding rate. Furthermore, the established inner and outer bound regions reflect the impact of the input constraint (Hamming weight) and the channel statistics, i.e., the crossover probability.
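To make the scaling in this abstract concrete, the sketch below evaluates $K = 2^{n\kappa}$ and the number of identifiable messages $2^{nR}$ for hypothetical values of $n$, $\kappa$, and $R$; these numbers are illustrative only and are not bounds from the paper.

```python
# Minimal sketch with assumed (hypothetical) rates; not values from the paper.
n = 100              # codeword length, kept small so the numbers stay readable
kappa, R = 0.1, 0.25 # assumed identification goal rate and DKI coding rate

K = 2 ** (n * kappa) # size of the goal message set, K = 2^(n*kappa)
N = 2 ** (n * R)     # number of reliably identifiable messages, 2^(n*R)

print(f"K = 2^(n*kappa) = {K:.3e}")
print(f"N = 2^(n*R)     = {N:.3e}")
```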