Publications (10) · 13.45 Total Impact
ABSTRACT: At very short timescales neuronal spike trains may be compared to binary streams where each neuron gives at most one spike per bin and therefore its state can be described by a binary variable. Time-averaged activity like the mean firing rate can generally be used on longer timescales to describe the dynamics; nevertheless, enlarging the space of the possible states up to the continuum may seriously bias the true statistics if the sampling is not accurate. We propose a simple transformation on binary variables which allows us to fix the dimensionality of the space to sample and to vary the temporal resolution of the analysis. For each time length, interactions among simultaneously recorded neurons are evaluated using log-linear models. We illustrate how to use this method by analysing two different sets of data, recorded respectively in the temporal cortex of freely moving rats and in the inferotemporal cortex of behaving monkeys engaged in a visual fixation task. A detailed study of the interactions is provided for both samples. In both datasets we find that some assemblies share robust interactions, invariant at different time lengths, while others cooperate only at delimited time resolutions; yet the size of the samples is too small to allow an unbiased estimate of all possible interactions. We conclude that an extensive application of our method to larger samples of data, together with the development of techniques to correct the bias in the estimate of the coefficients, would provide significant information about the structure of the interactions in populations of neurons.
Network Computation in Neural Systems 03/2004; 15(1):13-28. · 0.33 Impact Factor
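The kind of analysis this abstract describes can be sketched in a few lines (an illustrative reconstruction, not the paper's code, with made-up function names): spike trains are binarized at a fine bin width, coarsened with a logical OR so the state stays binary at any temporal resolution, and the pairwise log-linear interaction between two binary trains reduces to a log odds ratio.

```python
import math

def binarize(spike_times, t_max, dt):
    """0/1 per bin: 1 if the neuron fired at least once in that bin."""
    n_bins = int(t_max / dt)
    x = [0] * n_bins
    for t in spike_times:
        b = int(t / dt)
        if b < n_bins:
            x[b] = 1
    return x

def coarsen(x, k):
    """Merge k consecutive bins with a logical OR, keeping the state binary.
    This lets the temporal resolution vary without leaving the binary space."""
    usable = len(x) - len(x) % k
    return [1 if any(x[i:i + k]) else 0 for i in range(0, usable, k)]

def pairwise_interaction(x, y, eps=0.5):
    """Log-linear interaction coefficient for two binary trains: the log odds
    ratio of the joint counts, with pseudocount eps to avoid log(0)."""
    counts = {(0, 0): eps, (0, 1): eps, (1, 0): eps, (1, 1): eps}
    for a, b in zip(x, y):
        counts[(a, b)] += 1
    return math.log(counts[(1, 1)] * counts[(0, 0)]
                    / (counts[(1, 0)] * counts[(0, 1)]))
```

Positively correlated trains give a positive coefficient, anticorrelated ones a negative coefficient, and independent trains a value near zero.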
ABSTRACT: Experimental evidence suggests that spike timing might be used by neurons to process and store information. Unfortunately, the mathematical analysis of recurrent networks with spiking neurons is highly non-trivial. Most analytical studies have therefore focused on rate-based models, whereas spiking models tend to be studied numerically. In order to bridge this gap, we propose an effective spiking neuron model which still allows for the application of non-equilibrium statistical mechanical techniques. The model is flexible and its parameters can be adjusted in order to match real data. We analyze the population dynamics in the simple case of constant excitatory synapses.
Neurocomputing 01/2004; 58-60:239-244.
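As a minimal numerical stand-in for the setting the abstract analyzes (a spiking population with constant excitatory synapses), the sketch below simulates leaky integrate-and-fire neurons whose recurrent drive is proportional to the previous step's population activity. The model and all parameter values are illustrative assumptions, not the paper's effective model.

```python
import random

def simulate_population(n=100, steps=300, tau=20.0, j_exc=0.05, i_ext=1.1,
                        v_th=1.0, v_reset=0.0, dt=1.0, seed=0):
    """Leaky integrate-and-fire population with constant all-to-all excitatory
    coupling. Returns the fraction of neurons spiking at each time step."""
    rng = random.Random(seed)
    v = [rng.uniform(0.0, v_th) for _ in range(n)]  # random initial voltages
    frac_spiked = 0.0
    rates = []
    for _ in range(steps):
        spiked = 0
        for i in range(n):
            # leaky integration: external drive plus constant excitatory
            # feedback proportional to last step's population activity
            v[i] += (-v[i] + i_ext + j_exc * n * frac_spiked) * dt / tau
            if v[i] >= v_th:
                v[i] = v_reset
                spiked += 1
        frac_spiked = spiked / n
        rates.append(frac_spiked)
    return rates
```

With a supra-threshold external drive, the population settles into tonic firing whose rate the excitatory feedback shifts upward.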
ABSTRACT: Recent studies have explored theoretically the ability of populations of neurons to carry information about a set of stimuli, both in the case of purely discrete or purely continuous stimuli, and in the case of multidimensional continuous angular and discrete correlates, in the presence of additional quenched disorder in the distribution. An analytical expression for the mutual information has been obtained in the limit of large noise by means of the replica trick. Here, we show that the same results can actually be obtained in most cases without the use of replicas, by means of a much simpler expansion of the logarithm. Fitting the theoretical model to real neuronal data, we show that the introduction of correlations in the quenched disorder improves the fit, suggesting a possible role of signal correlations (actually detected in real data) in a redundant code. We show that even in the more difficult analysis of the asymptotic regime, an explicit expression for the mutual information can be obtained without resorting to the replica trick despite the presence of quenched disorder, both with a Gaussian and with a more realistic thresholded-Gaussian model. When the stimuli are mixed continuous and discrete, we find that with both models the information seems to grow logarithmically to infinity with the number of neurons and with the inverse of the noise, even though the exact general dependence cannot be derived explicitly for the thresholded-Gaussian model. In the large noise limit, lower values of information were obtained with the thresholded-Gaussian model, for a fixed value of the noise and of the population size. On the contrary, in the asymptotic regime, with very low values of the noise, a lower information value is obtained with the Gaussian model.
Physical Review E 10/2003; 68(3 Pt 1):031906. · 2.31 Impact Factor
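The quantity at the center of this line of work, the mutual information between a discrete stimulus and a Gaussian-distributed response, can be estimated numerically without any replica machinery. The Monte Carlo sketch below (a toy single-neuron check, not the paper's analytical expansion; function names are illustrative) averages the log-likelihood ratio between the conditional and marginal response densities over equiprobable stimuli.

```python
import math
import random

def gaussian_pdf(r, mu, sigma):
    """Density of a Gaussian with mean mu and standard deviation sigma."""
    return math.exp(-(r - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def mutual_information(means, sigma, n_samples=20000, seed=0):
    """Monte Carlo estimate, in bits, of I(stimulus; rate) for one unit whose
    rate is Gaussian around a stimulus-dependent mean (equiprobable stimuli)."""
    rng = random.Random(seed)
    p = len(means)
    total = 0.0
    for _ in range(n_samples):
        s = rng.randrange(p)                 # draw a stimulus
        r = rng.gauss(means[s], sigma)       # draw a response given the stimulus
        cond = gaussian_pdf(r, means[s], sigma)
        marg = sum(gaussian_pdf(r, m, sigma) for m in means) / p
        total += math.log2(cond / marg)
    return total / n_samples
```

Two well-separated stimuli yield close to 1 bit; identical tuning yields zero, and the large-noise regime the abstract studies corresponds to sigma large relative to the spacing of the means.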
Article: Statistical mechanics beyond the Hopfield model: solvable problems in neural network theory.
ABSTRACT: We present four 'case study' examples of solvable problems in the theory of recurrent neural networks, which are relevant to our understanding of information processing in the brain, but which are also interesting from a purely statistical mechanical point of view, even at the level of simple models (which helps in stimulating interdisciplinary work). The examples concern issues in network dynamics, network connectivity, spike timing and synaptic plasticity.
Reviews in the Neurosciences 02/2003; 14(1-2):181-93. · 3.26 Impact Factor
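The Hopfield model that the title takes as its starting point can itself be sketched in a few lines (a standard textbook version, included only for orientation, not code from the paper): Hebbian weights store binary patterns as fixed points of a recurrent sign dynamics.

```python
def train(patterns):
    """Hebbian weight matrix for ±1 patterns: w_ij = sum_p p_i p_j / n, w_ii = 0."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, steps=20):
    """Iterate the deterministic sign dynamics until (hopefully) a fixed point."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))  # local field on unit i
            s[i] = 1 if h >= 0 else -1
    return s
```

A stored pattern corrupted in one unit is pulled back to the original, which is the attractor behavior the statistical mechanical analysis quantifies.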
ABSTRACT: In a previous paper we have evaluated analytically the mutual information between the firing rates of N independent units and a set of multidimensional continuous and discrete stimuli, for a finite population size and in the limit of large noise. Here, we extend the analysis to the case of two interconnected populations, where input units activate output ones via Gaussian weights and a threshold-linear transfer function. We evaluate the information carried by a population of M output units, again about continuous and discrete correlates. The mutual information is evaluated solving saddle-point equations under the assumption of replica symmetry, a method that, by taking into account only the term linear in N of the input information, is equivalent to assuming the noise to be large. Within this limitation, we analyze the dependence of the information on the ratio M/N, on the selectivity of the input units and on the level of the output noise. We show analytically, and confirm numerically, that in the limit of a linear transfer function and of a small ratio between output and input noise, the output information approaches asymptotically the information carried in input. Finally, we show that the information loss in output does not depend much on the structure of the stimulus, whether purely continuous, purely discrete or mixed, but only on the position of the threshold nonlinearity, and on the ratio between input and output noise.
Physical Review E 05/2002; 65(4 Pt 1):041918. · 2.31 Impact Factor
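The feedforward architecture the abstract describes, N input units driving M output units through a weight matrix and a threshold-linear transfer function with additive output noise, can be sketched as follows (illustrative names and defaults, not the paper's notation):

```python
import random

def threshold_linear(h, threshold=0.0):
    """Threshold-linear transfer function: zero below threshold, linear above."""
    return max(0.0, h - threshold)

def forward(inputs, weights, threshold=0.0, noise=0.0, rng=None):
    """Activities of M output units driven by N input units.
    weights: M rows of N Gaussian weights; noise: std of additive output noise."""
    rng = rng or random.Random(0)
    out = []
    for row in weights:
        h = sum(w * x for w, x in zip(row, inputs))  # linear summation
        if noise > 0.0:
            h += rng.gauss(0.0, noise)               # output noise
        out.append(threshold_linear(h, threshold))
    return out
```

Raising the threshold or the output noise degrades the information the outputs retain about the inputs, which is the dependence the paper computes analytically.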
ABSTRACT: In a recent work we have introduced a novel approach to study the effect of weak nonlinearity in the transfer function on the information transmitted by an analogue channel, by means of a perturbative diagrammatic expansion. We extend here the analysis to all orders in perturbation theory, which allows us to release any constraint concerning the magnitude of the expansion parameter and to establish the rules to calculate easily the contribution at any order. As an example we explicitly compute the information up to the second order in nonlinearity, in the presence of random Gaussian connectivity and in the limit when the output noise is not small. We analyze the first and second order contributions to the mutual information as a function of the nonlinearity and as a function of the number of output units. We believe that an extensive application of our method via the analysis of the different contributions at distinct orders might be able to fill a gap between well-known analytical results obtained for linear channels and the non-trivial treatments which are required to study highly nonlinear channels.
International Journal of Modern Physics B 01/2002; 16:3527-3543. · 0.46 Impact Factor
ABSTRACT: In a recent study, the initial rise of the mutual information between the firing rates of N neurons and a set of p discrete stimuli has been analytically evaluated, under the assumption that neurons fire independently of one another to each stimulus and that each conditional distribution of firing rates is Gaussian. Yet real stimuli or behavioral correlates are high dimensional, with both discrete and continuously varying features. Moreover, the Gaussian approximation implies negative firing rates, which is biologically implausible. Here, we generalize the analysis to the case where the stimulus or behavioral correlate has both a discrete and a continuous dimension, like orientation and shape could be in a visual stimulus, or type and direction in a motor action. The functional relationship between the firing patterns and the continuous correlate is expressed through the tuning curve of the neuron, using two different parameters to modulate its width and its flatness. In the case of large noise, we evaluate the mutual information up to the quadratic approximation as a function of population size. We also show that in the limit of large N and assuming that neurons can discriminate between continuous values with a resolution Delta(theta), the mutual information grows to infinity like ln(1/Delta(theta)) when Delta(theta) goes to zero. Then we consider a more realistic distribution of firing rates, truncated at zero, and we prove that the resulting correction, with respect to the Gaussian firing rates, can be expressed simply as a renormalization of the noise parameter. Finally, we demonstrate the effect of averaging the distribution across the discrete dimension, evaluating the mutual information only with respect to the continuously varying correlate.
Physical Review E 09/2001; 64(2 Pt 1):021912. · 2.31 Impact Factor
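A tuning curve over a circular correlate with separate width and flatness parameters, of the kind the abstract refers to, can be written for illustration as a generalized Gaussian of the circular distance (the parametrization below is an assumption for concreteness, not necessarily the paper's):

```python
import math

def tuning_curve(theta, theta_pref, width, flatness, r_max=10.0):
    """Mean firing rate as a function of a circular stimulus variable theta.
    width sets the angular spread; a larger flatness exponent flattens the
    peak around the preferred angle theta_pref."""
    d = abs(theta - theta_pref)
    d = min(d, 2 * math.pi - d)          # circular distance
    return r_max * math.exp(-(d / width) ** flatness)
```

The rate peaks at r_max for theta = theta_pref and falls off with circular distance; raising the flatness exponent broadens the plateau near the peak while width controls the overall spread.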
Article: How much do they tell us to move?
Neurocomputing 06/2001; 38-40:1181-1184. · 2.01 Impact Factor
Article: A Perturbative Approach to Nonlinearities in the Information Carried by a Two-Layer Neural Network
ABSTRACT: We evaluate the mutual information between the input and the output of a two-layer network in the case of a noisy and nonlinear analogue channel. In the case where the nonlinearity is small with respect to the variability in the noise, we derive an exact expression for the contribution to the mutual information given by the nonlinear term in first order of perturbation theory. Finally we show how the calculation can be simplified by means of a diagrammatic expansion. Our results suggest that the use of perturbation theories applied to neural systems might give insight into the contribution of nonlinearities to the information transmission and in general to the neuronal dynamics.
International Journal of Modern Physics B 01/2001; 15:281-295. · 0.46 Impact Factor
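The perturbative logic in these two papers, expanding channel statistics in powers of a small nonlinearity parameter, can be checked numerically on a toy output f(x) = x + eps·x² with Gaussian input, for which first-order perturbation theory predicts a mean shift of eps·sigma² (a deliberately simple stand-in for the papers' mutual information expansion; names are illustrative):

```python
import random

def perturbed_mean(eps, sigma, n=200000, seed=1):
    """Monte Carlo mean of the weakly nonlinear output f(x) = x + eps * x**2
    for x ~ N(0, sigma**2). First-order perturbation theory in eps predicts
    E[f(x)] = eps * sigma**2, since E[x] = 0 and E[x**2] = sigma**2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        total += x + eps * x * x
    return total / n
```

For eps = 0 the channel is linear and the mean stays at zero; for small eps the simulated shift matches the first-order prediction, which is the regime where a truncated expansion of the mutual information is trustworthy.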
Publication Stats
21 Citations
13.45 Total Impact Points
Institutions

2003–2004

King's College London
 Department of Mathematics
London, ENG, United Kingdom


2001

Scuola Internazionale Superiore di Studi Avanzati di Trieste
Trieste, Friuli Venezia Giulia, Italy
