Preprint

Identification via Functions


Abstract

We develop a framework for studying the problem of identifying roots of a noisy function. We revisit a previous logarithmic bound on the number of observations and pose a general root-identification problem with three types of error. As a key finding, we establish a novel logarithmic lower bound on the number of observations that improves on the previous result across certain regimes of the error and accuracy parameters of the identification test. Furthermore, we recover the previous results for root identification as a special case and draw a connection to the message identification problem of Ahlswede.
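To see informally why logarithmic scaling in the confidence parameter is natural, consider the following toy identification test (a generic averaging argument under assumed Gaussian noise, not the construction or bound from this paper; the function, gap, and thresholds are hypothetical):

```python
import math
import random

def is_root(noisy_f, x0, gap=0.5, error=1e-3, sigma=1.0):
    """Toy identification test: decide whether f(x0) = 0 or |f(x0)| >= gap
    from noisy observations f(x0) + N(0, sigma^2).  Averaging
    m = O(log(1/error)) samples drives the misidentification probability
    below `error` (sub-Gaussian tail bound), mirroring the logarithmic
    scaling in the number of observations."""
    # Gaussian tail bound: m >= 8 sigma^2 log(1/error) / gap^2 samples.
    m = math.ceil(8 * sigma ** 2 * math.log(1 / error) / gap ** 2)
    avg = sum(noisy_f(x0) for _ in range(m)) / m
    return abs(avg) < gap / 2, m

# Hypothetical noisy function with true roots at +/-2.
random.seed(0)
f = lambda x: x ** 2 - 4 + random.gauss(0, 1.0)
decision_at_root, m = is_root(f, 2.0)    # f(2) = 0: should accept
decision_off_root, _ = is_root(f, 0.0)   # f(0) = -4: should reject
```

Note the sample count m depends only logarithmically on the target error probability, not on the search space.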


References
Article
Numerous applications of the Internet of Things (IoT) feature an event-recognition behavior for which the established Shannon capacity is not the appropriate central performance measure. Instead, the identification capacity has been developed in the literature as an alternative metric for such systems. In this paper, we develop deterministic K-identification (DKI) for the binary symmetric channel (BSC), with and without a Hamming weight constraint imposed on the codewords. This channel may be of use for IoT in the context of smart system technologies, where sophisticated communication models can be reduced to a BSC for the purpose of studying basic information-theoretic properties. We derive inner and outer bounds on the DKI capacity of the BSC when the size of the goal message set K may grow with the codeword length n. As a major observation, we find that, for deterministic encoding, if K grows exponentially in n, i.e., K = 2^{nκ}, where κ is the identification goal rate, then the number of messages that can be accurately identified also grows exponentially in n, i.e., as 2^{nR}, where R is the DKI coding rate. Furthermore, the established inner and outer bound regions reflect the impact of the input constraint (Hamming weight) and the channel statistics, i.e., the crossover probability.
Article
Various applications of molecular communications (MC) are event-triggered, and, as a consequence, the prevalent Shannon capacity may not be the right measure for performance assessment. Thus, in this paper, we motivate and establish the identification capacity as an alternative metric. In particular, we study deterministic identification (DI) for the discrete-time Poisson channel (DTPC), subject to an average and a peak molecule release rate constraint, which serves as a model for MC systems employing molecule counting receivers. It is established that the number of different messages that can be reliably identified for this channel scales as 2^{(n log n)R}, where n and R are the codeword length and coding rate, respectively. Lower and upper bounds on the DI capacity of the DTPC are developed. The obtained large capacity of the DI channel sheds light on the performance of natural DI systems such as natural olfaction, which are known for their extremely large chemical discriminatory power in biology. Furthermore, numerical results for the empirical miss-identification and false identification error rates are provided for finite-length codes. This allows us to characterize the behaviour of the error rate for increasing codeword lengths, which complements our theoretically derived scale for asymptotically large codeword lengths.
Article
Let M(x) denote the expected value at level x of the response to a certain experiment. M(x) is assumed to be a monotone function of x but is unknown to the experimenter, and it is desired to find the solution x = θ of the equation M(x) = α, where α is a given constant. We give a method for making successive experiments at levels x_1, x_2, … in such a way that x_n will tend to θ in probability.
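This procedure is the classical Robbins–Monro stochastic approximation scheme. A minimal sketch, assuming the standard gain sequence a_n = 1/n and a hypothetical linear response observed with Gaussian noise:

```python
import random

def robbins_monro(noisy_m, alpha, x1=0.0, n_steps=5000):
    """Stochastic approximation of the root of M(x) = alpha:
    x_{n+1} = x_n - a_n * (Y_n - alpha), where Y_n is a noisy
    observation of M(x_n) and a_n = 1/n is the gain sequence."""
    x = x1
    for n in range(1, n_steps + 1):
        y = noisy_m(x)                    # experiment at level x
        x = x - (1.0 / n) * (y - alpha)   # step against the excess response
    return x

# Illustrative monotone response M(x) = 2x + 1 with additive noise;
# the solution of M(x) = 5 is theta = 2.
random.seed(0)
noisy = lambda x: 2 * x + 1 + random.gauss(0, 0.5)
estimate = robbins_monro(noisy, alpha=5.0)
```

The iterates converge to θ in probability under mild conditions on M and the gains (Σ a_n = ∞, Σ a_n² < ∞), which a_n = 1/n satisfies.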
Article
Let M(x) be a regression function which has a maximum at the unknown point θ. M(x) is itself unknown to the statistician who, however, can take observations at any level x. This paper gives a scheme whereby, starting from an arbitrary point x_1, one obtains successively x_2, x_3, … such that x_n converges to θ in probability as n → ∞.
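The scheme described is the Kiefer–Wolfowitz procedure, which climbs a noisy finite-difference estimate of the slope. A minimal sketch, with assumed gain sequences a_n = 1/n and c_n = n^(-1/3):

```python
import random

def kiefer_wolfowitz(noisy_m, x1=0.0, n_steps=5000):
    """Locate the maximizer of M via noisy central differences:
    x_{n+1} = x_n + a_n * (Y(x_n + c_n) - Y(x_n - c_n)) / (2 * c_n)."""
    x = x1
    for n in range(1, n_steps + 1):
        a_n = 1.0 / n          # step sizes: sum a_n diverges
        c_n = n ** (-1.0 / 3)  # difference widths: shrink slowly
        slope = (noisy_m(x + c_n) - noisy_m(x - c_n)) / (2 * c_n)
        x = x + a_n * slope    # move uphill along the estimated slope
    return x

# Illustrative regression function M(x) = -(x - 3)^2, maximum at theta = 3,
# observed with additive Gaussian noise.
random.seed(1)
noisy = lambda x: -(x - 3.0) ** 2 + random.gauss(0, 0.1)
estimate = kiefer_wolfowitz(noisy)
```

The conditions c_n → 0, Σ a_n = ∞, and Σ (a_n/c_n)² < ∞ guarantee convergence in probability; the choices above satisfy them.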
Article
The authors' main finding is that any object among doubly exponentially many objects can be identified in blocklength n with arbitrarily small error probability via a discrete memoryless channel (DMC), if randomization can be used for the encoding procedure. A novel doubly exponential coding theorem is presented which determines the optimal second-order rate R, that is, the identification capacity of the DMC, as a function of its transmission probability matrix. This identification capacity turns out to be a well-known quantity: Shannon's transmission capacity of the DMC.
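The contrast between transmission and identification scaling can be illustrated with simple arithmetic (n and R below are arbitrary illustrative values, not taken from the paper):

```python
# With blocklength n and rate R, a DMC supports about 2^{nR} messages for
# transmission, but about 2^{2^{nR}} messages for identification with
# randomized encoding -- a doubly exponential count.
n, R = 20, 0.5                       # illustrative parameters
transmissible = 2 ** int(n * R)      # 2^{nR} = 1024 messages
identifiable = 2 ** transmissible    # 2^{2^{nR}}: exact big int in Python
digits = len(str(identifiable))     # over 300 decimal digits
```

The identifiable set dwarfs the transmissible one: its size is a number with hundreds of decimal digits even at this modest blocklength.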
Article
A study is made of the identification problem in the presence of a noiseless feedback channel, and the second-order capacity C_f (resp. C_F) for deterministic (resp. randomized) encoding strategies is determined. Several important phenomena are encountered: (1) although feedback does not increase the transmission capacity of a discrete memoryless channel (DMC), it does increase the (second-order) identification capacity; (2) noise increases C_f; (3) the structure of the new capacity formulas is simpler than that of C. E. Shannon's (1948) familiar formula. This has the effect that proofs of converses become easier than in the authors' previous work.
C. Kleinwächter, "Identification of Messages and Identifying Zeroes of a Function," in Int. Symp. Inf. Theory, 1997, pp. 156-.