Article

An introductory review of information theory in the context of computational neuroscience.

Institute for Telecommunications Research, University of South Australia.
Biological Cybernetics, 07/2011; 105(1):55-70. DOI: 10.1007/s00422-011-0451-9
Source: DBLP

ABSTRACT: This article introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. First, simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Second, this century will see a convergence of information theory and neuroscience; information theory will expand its foundations to incorporate biological processes more comprehensively, thereby helping to reveal how neuronal networks achieve their remarkable information-processing abilities.
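As a concrete illustration of the kind of formulae the article examines, and of where naive application can mislead, here is a minimal plug-in (maximum-likelihood) estimate of entropy and mutual information for discrete data. This is a generic sketch, not the article's method; the function names and the histogram are invented, and the downward bias of plug-in estimators at small sample sizes is exactly the kind of assumption-dependent pitfall the abstract warns about.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) Shannon entropy in bits.

    Biased downward for small samples -- one well-known pitfall of
    applying information-theoretic formulae without care.
    """
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def plugin_mutual_information(joint_counts):
    """Plug-in mutual information I(X;Y) in bits from a joint histogram,
    via I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    joint = np.asarray(joint_counts, dtype=float)
    h_x = plugin_entropy(joint.sum(axis=1))   # marginal over rows (X)
    h_y = plugin_entropy(joint.sum(axis=0))   # marginal over columns (Y)
    h_xy = plugin_entropy(joint.ravel())      # joint entropy
    return h_x + h_y - h_xy

# Hypothetical joint histogram: 4 stimuli (rows) x 8 spike-count bins (columns)
rng = np.random.default_rng(0)
joint = rng.integers(0, 50, size=(4, 8))
print(f"I(stimulus; response) ~ {plugin_mutual_information(joint):.3f} bits")
```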

Related publications:

  • ABSTRACT: Contemporary cochlear implants stimulate the auditory nerve with an array of up to 22 electrodes, yet additional electrodes do not typically improve hearing performance. Given that this limitation is primarily due to current spread, and that kinds of electrodes now in development may enable more focused stimulation, we recently proposed an information-theoretic modeling framework for estimating how many electrodes might achieve optimal hearing performance under a range of assumptions about electrodes and their placement relative to the nerve. Here, we extend this approach by introducing more realistic three-dimensional spiral geometries for the cochlea and the array, and by comparing the optimal number of electrodes predicted by our model in this case with that of our original model, which used a linear geometry. (An illustrative toy model of this electrode-count trade-off is sketched after this list.)
    Conference proceedings: ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 08/2012; 2012:2965-8.
  • ABSTRACT: Although several measurements and analyses have been made to support the idea that the brain is energy-optimized, there is one disturbing, contradictory observation: in theory, computation limited by thermal noise can occur as cheaply as ~$2.9\cdot 10^{-21}$ joules per bit ($kT\ln 2$). Unfortunately, for a neuron, the ostensible discrepancy from this minimum is startling: ignoring inhibition, the discrepancy is $10^6$ times this amount; taking inhibition into account, it is $1.4\cdot 10^8$ times. Here we point out that what has been defined as neural computation is actually a combination of computation and neural communication: the communication costs (transmission from each excitatory postsynaptic activation to the S4 gating charges of the fast Na+ channels of the initial segment, the fNa's) dominate the joule costs. Making this distinction between communication to the initial segment and computation at the initial segment (i.e., adding up the activated fNa's) implies that the size of the average synaptic event reaching the fNa's matches the standard deviation of the thermal noise, $(kT)^{1/2}$. Moreover, when computation is defined as the addition of activated fNa's, a biophysically plausible mechanism produces the appropriate number of bits for the cost to hit the minimum joules. This mechanism, which requires something like the electrical engineer's equalizer (not much more than the action-potential-generating conductances), operates only at or just below threshold. This active filter modifies the last few synaptic excitations, providing barely enough energy to transport the last set of mostly sub-threshold gating charges. That is, the last, threshold-achieving S4-subunit activation requires an energy that matches the information provided by the last few synaptic events: a ratio of $kT\ln 2$ joules per bit. (A quick numerical check of the $kT\ln 2$ figure follows this list.)
    08/2014.
  • ABSTRACT: We calculate and analyze the information capacity-achieving conditions and their approximations in a simple neuronal system. The input-output properties of individual neurons are described by an empirical stimulus-response relationship, and the metabolic cost of neuronal activity is taken into account. The exact (numerical) results are compared with a popular "low-noise" approximation method that employs the concepts of parameter estimation theory. We show that the approximate method gives reliable results only when response variability is sufficiently low. By employing specialized numerical procedures, we demonstrate that near-optimal information transfer can be achieved by a number of different input distributions; this implies that the precise structure of the capacity-achieving input matters less than the value of the capacity itself. Finally, we illustrate by example that an innocuous-looking stimulus-response relationship may lead to a problematic interpretation of the obtained Fisher information values. (A sketch of a standard capacity computation follows this list.)
    Biosystems, 04/2013.
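The electrode-count trade-off in the first abstract can be illustrated with a toy model; this is not the published model, only a sketch of the underlying idea. Each electrode is treated as a Gaussian channel whose neighbours contribute interference through exponential current spread, and per-electrode information is summed. All parameters (array length, spread constant, noise level) are invented for illustration.

```python
import numpy as np

def total_information(n_electrodes, array_len=20.0, spread=2.0, noise=0.1):
    """Illustrative total information (bits) for n electrodes on a line.

    Current spread turns neighbouring electrodes into interference, so
    adding electrodes eventually stops helping. All parameters invented.
    """
    x = np.linspace(0.0, array_len, n_electrodes)  # electrode positions
    d = np.abs(x[:, None] - x[None, :])            # pairwise distances
    gain = np.exp(-2.0 * d / spread)               # power coupling via current spread
    interference = gain.sum(axis=1) - 1.0          # crosstalk power at each site
    sinr = 1.0 / (noise + interference)            # per-electrode signal-to-(interference+noise)
    return float(0.5 * np.log2(1.0 + sinr).sum())  # sum of Gaussian-channel rates

best_n = max(range(1, 41), key=total_information)
print("illustrative optimum:", best_n, "electrodes")
```

With these made-up numbers the optimum lands well below the maximum tried, mirroring the qualitative point that more electrodes do not automatically yield more information.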
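The $kT\ln 2$ floor quoted in the second abstract is plain arithmetic, and the discrepancy factors can be checked directly. The only assumption added here is a body temperature of 310 K:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 310.0                 # body temperature (assumed), K

landauer = k_B * T * math.log(2)   # kT ln 2: minimum joules per bit
print(f"kT ln 2 at 310 K: {landauer:.2e} J/bit")   # ~2.97e-21 J, matching the quoted ~2.9e-21

# Discrepancy factors quoted in the abstract:
print(f"x 1e6   (ignoring inhibition):  {1e6 * landauer:.2e} J/bit")
print(f"x 1.4e8 (including inhibition): {1.4e8 * landauer:.2e} J/bit")
```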
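The third abstract's specialized numerical procedures are not spelled out in the listing; the textbook route to a capacity-achieving input distribution for a discrete memoryless channel is the Blahut-Arimoto algorithm, sketched below. The 3-stimulus, 4-response channel matrix is invented, and the metabolic-cost constraint the paper includes is omitted.

```python
import numpy as np

def blahut_arimoto(P, n_iter=500):
    """Capacity (bits) and capacity-achieving input for channel P[y|x].

    Rows of P index inputs (stimuli), columns index outputs (responses);
    each row must sum to 1. Standard Blahut-Arimoto iteration.
    """
    n_x = P.shape[0]
    p = np.full(n_x, 1.0 / n_x)                  # start from a uniform input
    for _ in range(n_iter):
        q = p[:, None] * P                       # joint p(x) P(y|x)
        q /= q.sum(axis=0, keepdims=True)        # posterior q(x|y)
        log_q = np.zeros_like(q)
        pos = q > 0
        log_q[pos] = np.log(q[pos])
        r = np.exp((P * log_q).sum(axis=1))      # p(x) proportional to exp sum_y P(y|x) log q(x|y)
        p = r / r.sum()
    q_y = p @ P                                  # output distribution
    mask = P > 0
    I = np.sum((p[:, None] * P)[mask] * np.log2((P / q_y)[mask]))
    return I, p

# Invented stimulus-response matrix: 3 stimulus levels, 4 response bins
P = np.array([[0.7, 0.2, 0.1, 0.0],
              [0.2, 0.5, 0.2, 0.1],
              [0.0, 0.1, 0.3, 0.6]])
capacity, p_opt = blahut_arimoto(P)
print(f"capacity ~ {capacity:.3f} bits; optimal input: {p_opt.round(3)}")
```

Because mutual information is flat near its maximum, perturbing the converged input changes the achieved rate only marginally, which is consistent with the paper's observation that several different input distributions can come close to capacity.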
