Stimulus-dependent suppression of chaos in recurrent neural networks

Lewis-Sigler Institute for Integrative Genomics, Icahn 262, Princeton University, Princeton, New Jersey 08544, USA.
Physical Review E (Impact Factor: 2.29). 07/2010; 82(1 Pt 1):011903. DOI: 10.1103/PHYSREVE.82.011903
Source: PubMed


Neuronal activity arises from an interaction between ongoing firing generated spontaneously by neural circuits and responses driven by external stimuli. Using mean-field analysis, we ask how a neural network that intrinsically generates chaotic patterns of activity can remain sensitive to extrinsic input. We find that inputs not only drive network responses, but also actively suppress ongoing activity, ultimately leading to a phase transition in which chaos is completely eliminated. The critical input intensity at the phase transition is a nonmonotonic function of stimulus frequency, revealing a "resonant" frequency at which the input is most effective at suppressing chaos, even though the power spectrum of the spontaneous activity peaks at zero frequency and falls off exponentially. A prediction of our analysis is that the variance of neural responses should be most strongly suppressed at frequencies matching the range over which many sensory systems operate.
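A rough numerical illustration of this effect (a sketch, not the paper's mean-field calculation) is given below in Python/NumPy: a random rate network of the form dx_i/dt = -x_i + sum_j J_ij tanh(x_j) + I cos(omega*t + theta_i), with J_ij drawn from N(0, g^2/N) and g > 1 so that the autonomous network is chaotic, and a tangent vector propagated along the trajectory to estimate the largest Lyapunov exponent. The gain g, drive frequency, and per-neuron phases theta_i are illustrative assumptions; the exponent should decrease with the input amplitude I and eventually become negative at the chaos-suppressing transition.

import numpy as np

def largest_lyapunov(I, omega=2 * np.pi * 0.05, N=200, g=1.5,
                     dt=0.05, T=1000.0, burn=100.0, seed=0):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # couplings, variance g^2 / N
    theta = rng.uniform(0.0, 2 * np.pi, size=N)        # random per-neuron input phases
    x = rng.normal(0.0, 1.0, size=N)                   # network state
    d = rng.normal(0.0, 1.0, size=N)                   # tangent (perturbation) vector
    d /= np.linalg.norm(d)
    steps, burn_steps, lyap = int(T / dt), int(burn / dt), 0.0
    for k in range(steps):
        r = np.tanh(x)
        # Tangent vector follows the dynamics linearized around the current state x(t)
        d = d + dt * (-d + J @ ((1.0 - r**2) * d))
        # State follows the driven rate equation
        x = x + dt * (-x + J @ r + I * np.cos(omega * k * dt + theta))
        nrm = np.linalg.norm(d)
        if k >= burn_steps:                            # discard the initial transient
            lyap += np.log(nrm)
        d /= nrm
    return lyap / ((steps - burn_steps) * dt)

for I in (0.0, 0.5, 1.0, 2.0):
    print(f"input amplitude I = {I:.1f}: lambda_max ~ {largest_lyapunov(I):.3f}")

Sweeping omega as well as I would trace out the nonmonotonic critical-intensity curve described in the abstract.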

  • Source
    • "The network nevertheless shows roughly asynchronous spiking behavior and is a reasonable candidate model for rich spontaneous cortical dynamics. We implemented this spiking network in an effort to test the alternative theory for stimulus induced reduction in spike count variability proposed by Rajan et al. (2010). "
    ABSTRACT: A signature feature of cortical spike trains is their trial-to-trial variability. This variability is large in the spontaneous state and is reduced when cortex is driven by a stimulus or task. Models of recurrent cortical networks with unstructured, yet balanced, excitation and inhibition generate variability consistent with evoked conditions. However, these models produce spike trains which lack the long timescale fluctuations and large variability exhibited during spontaneous cortical dynamics. We propose that global network architectures which support a large number of stable states (attractor networks) allow balanced networks to capture key features of neural variability in both spontaneous and evoked conditions. We illustrate this using balanced spiking networks with clustered assembly, feedforward chain, and ring structures. By assuming that global network structure is related to stimulus preference, we show that signal correlations are related to the magnitude of correlations in the spontaneous state. Finally, we contrast the impact of stimulation on the trial-to-trial variability in attractor networks with that of strongly coupled spiking networks with chaotic firing rate instabilities, recently investigated by Ostojic (2014). We find that only attractor networks replicate an experimentally observed stimulus-induced quenching of trial-to-trial variability. In total, the comparison of the trial-variable dynamics of single neurons or neuron pairs during spontaneous and evoked activity can be a window into the global structure of balanced cortical networks.
    Frontiers in Computational Neuroscience 05/2014; 8:56. DOI:10.3389/fncom.2014.00056 · 2.20 Impact Factor
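The variability measure whose stimulus-induced quenching this abstract discusses is the trial-to-trial Fano factor of spike counts. A minimal sketch of that measure is given below, applied to hypothetical toy count data (a trial-varying rate plus Poisson spiking for the spontaneous state, near-Poisson counts for the evoked state); the clustered balanced spiking network itself is not reproduced here, and all numbers are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n_trials, n_neurons, window = 200, 50, 0.2             # 200 trials, 50 neurons, 200 ms window (s)

def fano(counts):
    # Fano factor per neuron across trials, averaged over neurons with nonzero mean
    m = counts.mean(axis=0)
    v = counts.var(axis=0, ddof=1)
    keep = m > 0
    return np.mean(v[keep] / m[keep])

# Spontaneous state: the rate wanders from trial to trial (doubly stochastic counts)
base_rate = 10.0                                        # spikes/s
trial_rates = base_rate * rng.lognormal(0.0, 0.6, size=(n_trials, n_neurons))
spont_counts = rng.poisson(trial_rates * window)

# Evoked state: the stimulus pins the rate, leaving near-Poisson variability
evoked_counts = rng.poisson(2 * base_rate * window, size=(n_trials, n_neurons))

print(f"Fano factor, spontaneous: {fano(spont_counts):.2f}")
print(f"Fano factor, evoked:      {fano(evoked_counts):.2f}")

The Fano factor drops from well above 1 toward 1 when the "stimulus" arrives, the signature of quenched trial-to-trial variability.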
  • Source
    • "II. MEAN-FIELD THEORY FOR LEAKY ESN ON SPARSE REGULAR GRAPHS A. Mean-field equations From the seminal work [18], recently extended to the framework of stimulus driven RNN [20], [21], one can derive a self-constistent equation describing the statistical properties of the reservoir activity in the large n limit, which is known as the mean-field theory. In this section, we present a straightforward extension of [21] to leaky RNNs on regular graphs. "
    ABSTRACT: Echo State Networks are efficient time-series predictors whose performance depends strongly on the value of the spectral radius of the reservoir connections. Based on recent results on the mean-field theory of driven random recurrent neural networks, which allow the computation of the largest Lyapunov exponent of an ESN, we develop a cheap algorithm to establish a local and operational version of the Echo State Property which guarantees good prediction performance. The value of the spectral radius tuning the network to the edge of chaos is specific to the considered input and is larger than 1.
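A minimal numerical sketch of the quantity at the center of this abstract, the input-conditioned largest Lyapunov exponent of a leaky ESN, is given below. It estimates lambda_max by propagating a tangent vector along the driven trajectory rather than by the authors' mean-field algorithm; the reservoir size, leak rate, input scaling, and sinusoidal drive are illustrative assumptions. lambda_max < 0 along the driven trajectory is the local, operational echo-state condition referred to, and sweeping the spectral radius rho locates the input-specific edge of chaos, which can exceed 1.

import numpy as np

def esn_lyapunov(rho, n=300, a=0.7, T=5000, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0, size=(n, n))
    W /= np.max(np.abs(np.linalg.eigvals(W)))          # rescale to unit spectral radius
    W_in = rng.uniform(-0.5, 0.5, size=n)               # input weights (toy values)
    u = np.sin(2 * np.pi * 0.02 * np.arange(T))          # toy driving signal
    x = np.zeros(n)
    d = rng.normal(size=n)
    d /= np.linalg.norm(d)
    acc = 0.0
    for k in range(T):
        pre = rho * (W @ x) + W_in * u[k]
        # Jacobian of the leaky update x -> (1-a) x + a tanh(rho W x + W_in u), applied to d
        d = (1 - a) * d + a * (1.0 - np.tanh(pre)**2) * (rho * (W @ d))
        x = (1 - a) * x + a * np.tanh(pre)
        nrm = np.linalg.norm(d)
        acc += np.log(nrm)
        d /= nrm
    return acc / T

for rho in (0.9, 1.1, 1.3, 1.5):
    print(f"spectral radius rho = {rho:.1f}: lambda_max ~ {esn_lyapunov(rho):.3f}")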
  • Source
    • "Theoretical analysis of cortical models and cortical data have led to the concept that the operating point of the cortex is a balanced state in which synaptic excitatory and inhibitory currents across the cell membranes of cortical neurons are approximately equal (Brunel & Wang, 2003; Hamaguchi, Riehle, & Brunel, 2011; Hansel & van Vreeswijk, 2012; Rajan, Abbott, & Sompolinsky, 2010; Shelley, McLaughlin, Shapley, & Wielaard, 2002; van Vreeswijk & Sompolinsky, 1996). There is a growing body of experimental evidence in "
    ABSTRACT: Theoretical considerations have led to the concept that the cerebral cortex operates in a balanced state in which synaptic excitation is approximately balanced by synaptic inhibition from the local cortical circuit. This paper is about the functional consequences of the balanced state in sensory cortex. One consequence is gain control: there is experimental evidence and theoretical support for the idea that local circuit inhibition acts as a local automatic gain control throughout the cortex. Second, inhibition increases cortical feature selectivity: many studies of different sensory cortical areas have reported that suppressive mechanisms contribute to feature selectivity. Synaptic inhibition from the local microcircuit should be untuned (or broadly tuned) for stimulus features because of the microarchitecture of the cortical microcircuit. Untuned inhibition is probably the source of Untuned Suppression that enhances feature selectivity. Guided by a neuronal network model, we studied the function of inhibition in our experiments on orientation selectivity in the primary visual cortex (V1) of the macaque monkey. Our results revealed that Untuned Suppression, generated by local circuit inhibition, is crucial for the generation of highly orientation-selective cells in V1 cortex.
    Neural Networks 09/2012; 37. DOI:10.1016/j.neunet.2012.09.005 · 2.71 Impact Factor
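The sharpening role of untuned suppression can be illustrated with a generic "iceberg" toy calculation (a sketch, not the authors' V1 network model): a tuned excitatory drive minus a flat, orientation-independent suppressive level, followed by threshold rectification, yields a narrower tuning curve. All values below are illustrative.

import numpy as np

theta = np.linspace(-90, 90, 181)                      # orientation (deg), 1-deg steps
excitation = np.exp(-(theta / 30.0)**2)                # broadly tuned excitatory drive
untuned_suppression = 0.5                              # flat, untuned suppressive level

# Threshold rectification: subtracting untuned suppression narrows the tuning curve
response = np.maximum(excitation - untuned_suppression, 0.0)

def half_width(curve):
    # Half-width at half-maximum of a tuning curve centered at 0 deg
    half = curve.max() / 2.0
    return theta[curve >= half].max()

print(f"HWHM without suppression:      {half_width(excitation):.1f} deg")
print(f"HWHM with untuned suppression: {half_width(response):.1f} deg")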