A theory of rate coding control by intrinsic plasticity effects.

Institut des Systèmes Intelligents et de Robotique, CNRS - UMR 7222, Université Pierre et Marie Curie, Paris, France.
PLoS Computational Biology (Impact Factor: 4.83). 01/2012; 8(1):e1002349. DOI: 10.1371/journal.pcbi.1002349
Source: PubMed

ABSTRACT Intrinsic plasticity (IP) is a ubiquitous activity-dependent process regulating neuronal excitability and a cellular correlate of behavioral learning and neuronal homeostasis. Because IP is induced rapidly and maintained long-term, it likely represents a major determinant of adaptive collective neuronal dynamics. However, assessing the exact impact of IP has remained elusive. Indeed, it is extremely difficult to disentangle the complex non-linear interaction between IP effects, by which conductance changes alter neuronal activity, and IP rules, whereby activity modifies conductances via signaling pathways. Moreover, the mechanisms underlying the two major IP effects on firing rate, threshold and gain modulation, remain unknown. Here, using extensive simulations and sensitivity analysis of Hodgkin-Huxley models, we show that threshold and gain modulation are accounted for by plasticity of maximal conductances lying in two separate domains of the parameter space, corresponding to sub- and supra-threshold conductances (i.e. activating below or above the spike onset threshold potential). Analyzing equivalent integrate-and-fire models, we provide formal expressions for the sensitivity of firing to conductance parameters, unraveling the mechanisms governing IP effects. Our results generalize to the IP of other conductance parameters and allow strong inference for calcium-gated conductances, yielding a general picture that accounts for a large repertoire of experimental observations. The expressions we provide can be combined with IP rules in rate or spiking models, offering a general framework to systematically assess the computational consequences of IP of pharmacologically identified conductances, with both fine-grained description and mathematical tractability. We provide an example of such an IP loop model addressing the important issue of the homeostatic regulation of spontaneous discharge.
Because we make no assumptions about modification rules, the present theory is also relevant to other neural processes involving excitability changes, such as neuromodulation, development, aging, and neural disorders.
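To make the distinction between IP effects and IP rules concrete, here is a minimal sketch of an "IP loop" in a rate model. This is a hypothetical illustration, not the authors' model: the threshold-linear transfer function, the integral-control rule, and all parameter values (`theta`, `target_rate`, `eta`) are assumptions chosen for clarity. The IP *effect* is the threshold shift of the transfer function; the IP *rule* is the activity-dependent drift of that threshold toward a homeostatic target rate.

```python
import numpy as np

def transfer(I, theta, gain=1.0):
    """Threshold-linear transfer function: firing rate vs. input current."""
    return gain * np.maximum(I - theta, 0.0)

def ip_loop(inputs, theta=0.5, target_rate=1.0, eta=0.01):
    """Run the IP loop: each input sets the rate (IP effect), and the
    rate error slowly drifts the threshold parameter (IP rule)."""
    rates = np.empty_like(inputs)
    for t, I in enumerate(inputs):
        r = transfer(I, theta)
        theta += eta * (r - target_rate)  # homeostatic integral control
        rates[t] = r
    return rates, theta

rng = np.random.default_rng(0)
rates, theta_final = ip_loop(rng.normal(2.0, 0.3, size=5000))
# The late-time mean rate settles near the homeostatic target.
```

Because the rule integrates the rate error, the threshold converges to whatever value makes the time-averaged rate equal the target, which is exactly the spontaneous-discharge regulation scenario the abstract describes.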

  •
    ABSTRACT: The intrinsic excitability of neurons is known to be dynamically regulated by activity-dependent plasticity and homeostatic mechanisms. Such processes are commonly analyzed in the context of input-output functions that describe how neurons fire in response to constant levels of current. However, it is not well understood how changes of excitability observed under static inputs translate to the function of the same neurons in their natural synaptic environment. Here we performed a computational study and hybrid experiments on rat bed nucleus of stria terminalis neurons to compare the two scenarios. The inwardly rectifying Kir current (IKir) and the hyperpolarization-activated cation current (Ih) were found to be considerably more effective in regulating firing under synaptic inputs than under static stimuli. This prediction was experimentally confirmed by dynamic clamp insertion of a synthetic inwardly rectifying Kir current into the biological neurons. Conversely, ionic currents that activate with depolarization were more effective in regulating firing under static inputs. When two intrinsic currents are altered concurrently, as under homeostatic regulation, the contrast between firing responses to static versus dynamic inputs can be even greater. Our results show that plastic or homeostatic changes of intrinsic membrane currents can shape the current-step responses of neurons and their firing under synaptic inputs in a differential manner.
    Journal of Neurophysiology 10/2014; 113(1). DOI:10.1152/jn.00226.2014 · 3.04 Impact Factor
  •
    ABSTRACT: We present an unsupervised, local, activation-dependent learning rule for intrinsic plasticity (IP) that adjusts the composition of ion channel conductances of single neurons in a use-dependent way. We use a single-compartment conductance-based model of medium spiny striatal neurons to show the effects of parametrization of individual ion channels on the neuronal activation function. We show that parameter changes within physiological ranges are sufficient to create an ensemble of neurons with significantly different activation functions. We emphasize that the effects of intrinsic neuronal variability on spiking behavior require a distributed mode of synaptic input and can be eliminated by strongly correlated input. We show how variability and adaptivity in ion channel conductances can be utilized to store patterns without an additional contribution by synaptic plasticity (SP). The adaptation of the spike response may result in either "positive" or "negative" pattern learning. However, read-out of stored information depends on a distributed pattern of synaptic activity that lets intrinsic variability determine the spike response. We briefly discuss the implications of this conditional memory for learning and addiction.
    02/2014; 2(88). DOI:10.12688/f1000research.2-88.v2
  •
    ABSTRACT: Homeostatic intrinsic plasticity (HIP) is a ubiquitous cellular mechanism regulating neuronal activity, cardinal for the proper functioning of nervous systems. In invertebrates, HIP is critical for orchestrating stereotyped activity patterns. The functional impact of HIP remains more obscure in vertebrate networks, where higher order cognitive processes rely on complex neural dynamics. The hypothesis has emerged that HIP might control the complexity of activity dynamics in recurrent networks, with important computational consequences. However, conflicting results about the causal relationships between cellular HIP, network dynamics, and computational performance have arisen from machine-learning studies. Here, we assess how cellular HIP effects translate into collective dynamics and computational properties in biological recurrent networks. We develop a realistic multiscale model including a generic HIP rule regulating the neuronal threshold with realistic molecular signaling-pathway kinetics, Dale's principle, sparse connectivity, synaptic balance, and Hebbian synaptic plasticity (SP). Dynamic mean-field analysis and simulations reveal that HIP sets a working point at which inputs are transduced by large-derivative ranges of the transfer function. This cellular mechanism ensures increased complexity of network dynamics, robust balance with SP at the edge of chaos, and improved input separability. Although critically dependent upon balanced excitatory and inhibitory drives, these effects display striking robustness to changes in network architecture, learning rates, and input features. Thus, the mechanism we unveil might represent a ubiquitous cellular basis for complex dynamics in neural networks. Understanding this robustness is an important challenge to unraveling principles underlying self-organization around criticality in biological recurrent neural networks.
    The Journal of Neuroscience : The Official Journal of the Society for Neuroscience 09/2013; 33(38):15032-43. DOI:10.1523/JNEUROSCI.0870-13.2013 · 6.75 Impact Factor
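The last abstract's core idea, a generic HIP rule driving each neuron's threshold toward a target output, can be sketched in a few lines. This is a hypothetical simplification, not the authors' multiscale model: the sigmoid rate network, the weight scale, and the parameters (`target`, `eta`) are assumptions, and the molecular signaling kinetics, Dale's principle, sparse connectivity, and Hebbian SP of the actual model are all omitted.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
N = 100
W = rng.normal(0.0, 1.5 / np.sqrt(N), size=(N, N))  # random recurrent weights
r = np.full(N, 0.5)   # unit firing rates
theta = np.zeros(N)   # adaptive thresholds (the HIP variable)
target = 0.3          # homeostatic target rate
eta = 0.005           # slow HIP learning rate

for _ in range(20000):
    r = sigmoid(W @ r - theta)   # network update (IP effect)
    theta += eta * (r - target)  # HIP rule: drift toward the target rate

# After adaptation, every unit fires near the target rate, i.e. on a
# steep (large-derivative) region of its sigmoid transfer function.
```

The qualitative point survives the simplification: the integral-control rule places each unit's operating point at a fixed output level regardless of its recurrent input statistics, which is the "working point" mechanism the abstract attributes to HIP.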