epl draft
Functional importance of noise in neuronal information processing
Daqing Guo1, Matjaž Perc2,3, Tiejun Liu1, Dezhong Yao1
1The Clinical Hospital of Chengdu Brain Science Institute, MOE Key Lab for Neuroinformation, School of Life
Science and Technology, University of Electronic Science and Technology of China, Chengdu 610054, China
2Faculty of Natural Sciences and Mathematics, University of Maribor, Koroška cesta 160, SI-2000 Maribor, Slovenia
3Complexity Science Hub Vienna, Josefstädterstraße 39, A-1080 Vienna, Austria
PACS 05.45.-a – Nonlinear dynamics and chaos
PACS 87.19.L- – Neuroscience
PACS 87.17.-d – Cell processes
Abstract –Noise is an inherent part of neuronal dynamics, and thus of the brain. It can be
observed in neuronal activity at different spatiotemporal scales, including in neuronal membrane
potentials, local field potentials, electroencephalography, and magnetoencephalography. A central
research topic in contemporary neuroscience is to elucidate the functional role of noise in neu-
ronal information processing. Experimental studies have shown that a suitable level of noise may
enhance the detection of weak neuronal signals by means of stochastic resonance. In response,
theoretical research, based on the theory of stochastic processes, nonlinear dynamics, and sta-
tistical physics, has made great strides in elucidating the mechanism and the many benefits of
stochastic resonance in neuronal systems. In this perspective, we review recent research dedicated
to neuronal stochastic resonance in biophysical mathematical models. We also explore the regu-
lation of neuronal stochastic resonance, and we outline important open questions and directions
for future research. A deeper understanding of neuronal stochastic resonance may afford us new
insights into the highly impressive information processing in the brain.
Introduction. – The human brain is composed
of nearly 100 billion neurons that conduct signals via
synapses [1]. The massive number of neurons is believed
to be the fundament behind the computational power of
the brain, which integrates and generates neuronal in-
formation in the form of action potentials, or so-called
spikes [1,2]. A deep understanding of the mechanisms of
signal processing among neurons promises to reveal the
computational principles of the brain.
Neuronal activity recorded in electrophysiological stud-
ies typically exhibits a certain level of stochastic fluc-
tuations [3, 4]. This stochastic feature can be observed
even when our brain is at rest, suggesting that biological
neurons might operate in noisy environments. To date,
many types of noise sources have been identified in the
brain, such as thermal noise [5], ion channel noise [6–10],
synaptic release noise [11–16], and synaptic bombardment
noise [17–20]. Among them, the stochastic opening and
closing of ion channels and massive amounts of synaptic
bombardment are thought to be two main noise sources
in neural systems. Because the firing dynamics of neu-
rons can be largely impacted and disrupted by a stochas-
tic drive, a natural question arises: does neuronal noise
always act destructively in the brain? Accumulating evi-
dence indicates that, under proper conditions, neuronal
noise may in fact promote the signal processing of neu-
rons [4,21–24].
Perhaps the most prominent noise-induced counterintu-
itive behavior discovered in the past 40 years is stochastic
resonance (SR) [25–29]. In its original form, SR refers to
the phenomenon whereby a nonlinear system driven by a
weak periodic signal transmits information maximally at
an intermediate noise level [25–29]. Neurons in the brain are not
only driven by stochastic fluctuations from various noise
sources but also receive rhythmic signals due to neuronal
oscillations [1, 2]. Furthermore, biological neurons exhibit
different types of firing excitability and show highly non-
linear responses to external stimuli [30]. These features
enable the SR to serve as an underlying dynamical mech-
anism that real neurons utilize to improve information
processing, an idea supported by increasing experimen-
tal data. For instance, Douglass et al. showed that an
appreciable level of external noise can enhance informa-
tion transfer in the mechanoreceptive sensory neurons of
arXiv:1812.09897v2 [q-bio.NC] 28 Dec 2018
crayfish, indicating the occurrence of SR in neural sys-
tems [31]. Similar neuronal SR phenomena have been
widely observed in sensory systems of other species, such
as crickets [32], rats [33] and paddlefish [34]. Later inves-
tigations have also demonstrated SR in neurons from the
mammalian brain [35, 36].
Recent studies using computational modeling have pro-
vided more evidence of the benefits of SR occurring in neu-
ral systems [23,24]. Specifically, it has been demonstrated
that different types of SR may arise in neural systems and
can be regulated by several intrinsic and external proper-
ties. In this mini review, we briefly summarize and discuss
computational results on the functional roles of noise in
enhancing neuronal information processing via SR-related
mechanisms and propose several urgent questions in this
research field.
Models and measurements for neuronal SR stud-
ies. – Classical neuronal SR studies focus on a neuron
driven by both a subthreshold periodic signal and stochas-
tic noise (Fig. 1(a)) [37–39]. As a paradigmatic model, the
Hodgkin-Huxley (HH) neuron has served as the preferred
model for simulating neuronal dynamics [2, 40]. In classi-
cal neuronal SR studies, the current balance equation of
the HH neuron can be written as:
C dV/dt = −I_Na − I_K − I_L + I_driving + I_noise,    (1)
where V represents the membrane potential and C is the
membrane capacitance per unit area. The sodium, potas-
sium, and leakage currents through the membrane are
modeled as I_Na = G_Na m^3 h (V − E_Na), I_K = G_K n^4 (V − E_K),
and I_L = G_L (V − E_L), respectively. The model parameters
G_Na, G_K and G_L are the maximal sodium, potassium, and
leakage conductances per unit area, and E_Na, E_K and E_L
denote the corresponding reversal potentials. The three
gating variables obey the following rate equations [2,40]:
dx/dt = α_x(V)(1 − x) − β_x(V) x,    (2)
where x = m, h, n, and α_x(V) and β_x(V) are six voltage-
dependent rate functions given in [2,40]. The subthreshold
periodic driving can be mimicked by a sinusoidal signal,
I_driving = A sin(2π f_s t), with A and f_s representing the
signal amplitude and forcing frequency, respectively. The
noise current is typically modeled as I_noise = I_0 + D ξ(t),
where I_0 denotes the bias current, ξ(t) is Gaussian
white noise with zero mean and unit variance, and D is
a parameter controlling the noise intensity. To a certain
extent, this type of noise current can be used to simulate
membrane fluctuations due to massive synaptic bombard-
ment [18–20]. Note that, as another important type of
neuronal noise, the ion channel noise can be modeled by
adding stochastic terms in the Langevin equations for gat-
ing variables [8–10]:
dx/dt = α_x(V)(1 − x) − β_x(V) x + ξ_x(t),    (3)
Fig. 1: (Color online) Noise-induced SR in neural systems.
(a) Framework for classical neuronal SR studies. A neuron is
simultaneously driven by periodic inputs and stochastic noise.
The neuron optimally responds to the periodic signal at an
intermediate noise level. (b) Dependence of the SNR on the
noise intensity D. A bell-shaped SNR curve is seen by varying
D, indicating the occurrence of SR. At the same noise level, a
relatively stronger subthreshold periodic driving signal evokes
a higher SNR value. (c) Dependence of SNR on the input
frequency fs. For different membrane capacitances, the neuron
exhibits distinct frequency sensitivity ranges. Data depicted in
(b) and (c) are based on simulations using the HH model.
where ξ_x(t) (x = m, h, n) are independent Gaussian white
noises with zero mean. For different gating variables, the
noise autocorrelation functions depend on the stochastic
membrane potential and on the total number of ion chan-
nels, which is set by the channel densities and the area
of the membrane patch, as described in detail in previous
modeling studies [8–10].
The information transfer capability for neurons driven
by external periodic inputs can be quantified by several
measurements. Similar to SR in other dynamical systems,
the most frequently used measure is the signal-to-noise
ratio (SNR) at the input frequency f_s [24]. To compute
the SNR, the power spectral density of the spike train is
estimated using a fast Fourier transform. Mathematically,
the SNR is defined as [20, 38]:
SNR = S(f_s)/N(f_s),
where S(f_s) is the power at the input frequency f_s, and
N(f_s) is the mean background power at nearby frequen-
cies. As a signature of neuronal SR, the SNR curve ex-
hibits a bell-shaped profile as a function of the noise in-
tensity (Fig. 1(b)). On the other hand, the second
measure widely employed in neuronal SR studies is the
Fourier coefficient [41, 42]. This measurement has been
confirmed to be proportional to the square of the spectral
power amplification and can be calculated as [43]:
Q = sqrt(Q_sin^2 + Q_cos^2),
where Q_sin = (1/nT) ∫_0^{nT} 2V(t) sin(2πt/T) dt and
Q_cos = (1/nT) ∫_0^{nT} 2V(t) cos(2πt/T) dt. Here, T = 1/f_s
is the period of the external driving signal, and n is a
positive integer related to the simulation time. Similar to
the SNR, a larger Q value implies a better information
transfer capability. When SR behavior occurs in neural
systems, a peak can be seen in the Q curve as a function of
noise intensity. Note that several other measurements,
such as power norms [44–46] and information-based mea-
sures [47–49], can also be used to characterize the perfor-
mance of SR.
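Both measures are straightforward to estimate numerically. The following sketch is an assumed implementation, not code from the cited studies: the SNR is read off the power spectrum at f_s, and Q is computed directly from a signal V(t); the guard and band widths used to estimate the background power are illustrative choices, and the synthetic check uses a noisy sinusoid whose amplitude the Q measure should recover.

```python
import numpy as np

def snr_at(signal, dt, fs, half_band=5, guard=2):
    """SNR = S(f_s)/N(f_s): peak power over mean background power nearby."""
    n = len(signal)
    power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=dt)
    k = np.argmin(np.abs(freqs - fs))            # spectral bin closest to f_s
    background = np.r_[power[max(k - guard - half_band, 1):k - guard],
                       power[k + guard + 1:k + guard + 1 + half_band]]
    return power[k] / background.mean()

def fourier_q(v, dt, fs):
    """Q = sqrt(Q_sin^2 + Q_cos^2) over an integer number n of periods T."""
    T = 1.0 / fs
    n_periods = int(len(v) * dt / T)
    m = int(round(n_periods * T / dt))           # samples covering n*T
    t = np.arange(m) * dt
    q_sin = np.mean(2.0 * v[:m] * np.sin(2 * np.pi * t / T))
    q_cos = np.mean(2.0 * v[:m] * np.cos(2 * np.pi * t / T))
    return np.hypot(q_sin, q_cos)

# Synthetic check: a noisy sinusoid of amplitude 3 should give Q close to 3
rng = np.random.default_rng(1)
dt, fs_in = 0.1, 0.05                            # ms and kHz (i.e. 50 Hz)
t = np.arange(0, 10000, dt)
v = 3.0 * np.sin(2 * np.pi * fs_in * t) + rng.standard_normal(t.size)
print(fourier_q(v, dt, fs_in))                   # close to 3.0
print(snr_at(v, dt, fs_in))                      # much greater than 1
```

In an SR study, the `signal` passed to `snr_at` would be a binned spike train rather than a raw sinusoid, but the spectral estimate is the same.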
Stochastic resonance in neural systems. – Com-
putational models offer an efficient approach to investigate
neuronal SR. Following experimental observations, recent
theoretical explorations have suggested that distinct types
of SR may be evoked in neural systems under different cir-
cumstances. In this section, we briefly summarize several
typical neuronal SR behaviors observed in modeling studies.
As described above, the classical neuronal SR frame-
work mainly considers a neuron or a population of neu-
rons driven by both stochastic noise and the subthresh-
old periodic force with a single frequency component [37–
39]. By computational modeling, the classical SR has
been demonstrated to appear in spiking model neurons
with either threshold-spiking or resonance-spiking mecha-
nisms [20,26, 38, 50]. These two generation mechanisms of
spikes involve firing dynamics for most biological neurons
and correspond to class I and II neuronal excitability [30],
respectively. As shown in Fig. 1(b), the performance of
classical neuronal SR is impacted by the amplitude of the
external periodic signal. A relatively stronger subthresh-
old periodic driving signal tends to evoke a higher peak
SNR. Furthermore, this amplitude-based neuronal SR is
also modulated by the input frequency of the external driv-
ing signal (Fig. 1(c)). It has been found that neurons
show optimal responses to the external periodic signal
within a specific frequency range, implying the existence of fre-
quency sensitivity. Similar neuronal frequency sensitivity
has been widely reported in past experimental and compu-
tational studies [20,51–53], which can be attributed to the
cooperation of the intrinsic neuronal oscillations and the
periodic input signals. This observation is of particular
interest because oscillatory signals in the brain cover mul-
tiple frequency bands [1], and neurons with different in-
trinsic spiking dynamics might preferentially process weak
neural information in distinct frequency bands via the SR
mechanism (see Fig. 1(c)).
On the other hand, biological neurons may simultane-
ously receive more complicated oscillatory signals from dif-
ferent brain regions with mixed-frequency features. Thus,
Fig. 2: (Color online) Noise-induced frequency-difference-
dependent SR (FDDSR) in neural systems. (a) Schematic pre-
sentation of FDDSR studies. A neuron is driven by stochastic
noise and two periodic inputs with frequencies f_1 and f_2. (b)
Dependence of the SNR on noise intensity D at different beat
frequencies f_0 = |f_1 − f_2|. (c) Maximal SNR versus f_0. The
neuron shows a better performance at a relatively lower beat
frequency. (d) Dependence of the SNR on driving frequencies
f_1 and f_2. The neuron exhibits a wider frequency-sensitivity
range at an intermediate noise level. The data presented here
are adapted from our previous study [57].
a naturally arising question is whether similar SR phe-
nomena can also be discovered in neural systems with
multiple periodic input components. As pioneers in this
field, Chialvo et al. confirmed that a single neuron driven
by mixed periodic signals with harmonic frequencies of
a fundamental frequency showed the maximal response
to the fundamental frequency at an intermediate noise
level [54–56]. They labeled this SR-like effect the “ghost
SR” (GSR), because the response appears at a fundamen-
tal frequency that is itself missing from the input sig-
nals [54–56]. Recently, we extended the theory of ghost
SR by investigating the response of neural systems to
two subthreshold periodic signals with an arbitrary dif-
ference in frequency (Fig. 2(a)) [57]. Through computa-
tional modeling, we demonstrated that SR might occur at
the beat frequency in neural systems at both the single-
neuron and population levels (Fig. 2(b)). Similarly to the
classical SR in neural systems [51–53], the performance
of this frequency-difference-dependent SR exhibits a fre-
quency sensitivity feature, and a smaller beat frequency
corresponds to a stronger SNR at the optimal noise level
(Figs. 2(c) and 2(d)). Furthermore, our simulations indi-
cated that the population response of neural ensembles is
more efficient than that of a single neuron at detecting
neural information carried by the superposition of multiple
periodic signals at the beat frequency. These results high-
Fig. 3: (Color online) Aperiodic SR and suprathreshold SR
in neural systems. (a) Schematic presentation of a summing
network consisting of N neurons. Each neuron is driven by a
common aperiodic signal and independent stochastic noise. (b)
Dependence of the normalized power norm on noise intensity.
(c) Dependence of the information rate on noise intensity. Ape-
riodic SR appears for a weak common driving signal, whereas
suprathreshold SR can be observed at the population level for
a strong common driving signal. The data presented here are
adapted from previous studies [44, 48].
light the functional roles of stochastic noise in enhancing
the signal transduction for beat-frequency-related neural
information and may have important biological applica-
Although neuronal SR studies have commonly assumed
weak periodic external forces, it should be noted that the
SR-type behaviors in neural systems are limited neither
to periodic signals nor to signals below the threshold
level (Fig. 3(a)). Using a variety of spiking
model neurons, Collins et al. developed a theory showing
that the SR effect can be detected with the correlation-based
power norm when a neuron is subjected to weak aperi-
odic signals and white noise (Fig. 3(b)) [44–46]. Inspired
by the aperiodic property of the external driving signal,
this phenomenon was designated as aperiodic SR [44–46].
Compared with a single neuron, the detection capability of
a weak signal via aperiodic SR can be further improved in
a summing network consisting of a population of neurons.
This theoretical prediction has been demonstrated by well-
designed biological experiments, showing that aperiodic
SR indeed exists in mammalian sensory neurons [46]. Us-
ing information-based measurements, numerical simula-
tions have shown that SR-type behavior can also be ob-
served in a summing neuronal network even when the ex-
ternal driving signal is sufficiently strong and above the
threshold of neurons (Fig. 3(c)) [47–49]. However, the bell-
shaped measurement curve disappears for a single neuron
(see N = 1 in Fig. 3(c)), indicating that the suprathresh-
old neuronal SR may be only observed at the population
level. Further experiments need to be designed and per-
formed to verify whether suprathreshold SR indeed exists
in the real brain.
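The correlation-based power norm of Collins et al. can be sketched for a toy threshold unit as follows. This is an assumed, simplified implementation: the threshold "neuron", the smoothing width, and the noise levels are all illustrative choices, not taken from [44–46].

```python
import numpy as np

def smoothed_rate(spike_train, width=50):
    """Boxcar-smoothed instantaneous rate from a 0/1 spike train."""
    kernel = np.ones(width) / width
    return np.convolve(spike_train, kernel, mode="same")

def power_norm(signal, noise_D, theta=1.0, seed=0):
    """C0 = <S(t) R(t)>: correlation of the aperiodic signal with the
    mean-subtracted firing rate of a crude threshold-crossing unit."""
    rng = np.random.default_rng(seed)
    x = signal + noise_D * rng.standard_normal(signal.size)
    spikes = (x > theta).astype(float)
    rate = smoothed_rate(spikes)
    return float(np.mean(signal * (rate - rate.mean())))

# Slowly varying aperiodic signal kept well below the threshold theta = 1
rng = np.random.default_rng(42)
s = np.convolve(rng.standard_normal(20000), np.ones(500) / 500, mode="same")
s = 0.5 * s / np.abs(s).max()

c_weak, c_mid, c_strong = (power_norm(s, D) for D in (0.1, 1.0, 10.0))
print(c_weak, c_mid, c_strong)   # intermediate noise yields the largest C0
```

Too little noise leaves the subthreshold signal undetected, while too much noise swamps it; the intermediate-noise maximum of C0 is the aperiodic-SR signature discussed above.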
We emphasize that many other types of SR-related be-
haviors may also occur in neural systems, and several
of them are described as follows. First, although previ-
ous modeling studies have shown that neurons operating
as coincidence detectors (class III excitability) may not
show classical SR behaviors, this type of neuron has been
identified as sensitive to changes in the stimulus [58]. In
the presence of noise, this dynamic feature causes class
III neurons to exhibit slope-based SR behavior, a phe-
nomenon supported by both experimental and computa-
tional evidence [58, 59]. Second, even without an exter-
nal driving signal, neurons perturbed solely by noise may
also display so-called internal or autonomous SR behav-
ior [60]. This phenomenon is highly associated with co-
herence resonance in excitable systems [61], and a noise-
induced peak can be seen clearly in the power spectrum.
As the noise intensity is increased, the SNR value first
rises and then drops, so that internal SR appears at an
intermediate level of neuronal noise. This phenomenon
has not only been observed in theoretical studies but has
also been confirmed in recent biological experiments [62].
Third, a similar enhancement effect of stochastic noise can
also be induced by a high-frequency signal via vibrational
resonance (VR) [63]. As an important variant of SR, the
phenomenon of VR has been observed in many neural sys-
tems, in which the neuronal response to a low-frequency
external driving signal is improved at the optimal ampli-
tude of the high-frequency signal [64, 65]. This finding
is of biological importance because high-frequency neural
oscillations have been widely observed in the brain and
have been linked to many higher brain functions [66]. Fi-
nally, the response of many nonlinear systems to sub-
threshold periodic signals has been found to be enhanced
at more than one noise level [67,68]. Note that this specific type of
SR is referred to as stochastic multi-resonance or multiple
SR, which has also been widely reported in recent neural
modeling studies [69–71].
Overall, these observations provide both the computa-
tional evidence and theoretical basis for the occurrence
of SR in neural systems. After long-term evolution, our
brain might have the ability to exploit SR by optimizing
stochastic noise from different sources to facilitate neu-
ronal information processing.
Regulation of SR in neural systems. – What are
the regulatory mechanisms of neuronal SR in the brain?
Theoretically, several underlying intrinsic and external
biological factors can serve this function, and we discuss
this issue at different levels in what follows.
At the single-neuron level, both the weak signal detec-
tion capability and the frequency sensitivity range of neu-
rons depend strongly on single-neuron firing properties
(Fig. 4(a)). Experimental studies have established that
intrinsic differences in neuronal morphology, the distribu-
tion of ion channels, ionic concentrations, specific mem-
brane properties and threshold diversity might result in
the distinctive firing properties of neurons [72–76]. In ad-
dition, several other external factors, such as body tem-
perature [77], autaptic and shunting inhibition [78, 79],
short-term plasticity [71] and neuron-glia interactions [80],
may also dramatically change neuronal firing properties.
We predict that these intrinsic and external factors may
play functional roles in regulating neuronal SR. On the
other hand, real neurons are driven by different types of
stochastic noise in the brain. In theory, neuronal noise
stemming from different sources should be modulated by
distinct biological mechanisms [3]. As mentioned above,
synaptic bombardment noise and ion channel noise are
two main types of noise in neural systems, and their joint
effect may dominate the stochastic dynamics of neuronal
firing (Fig. 4(a)). Using mean-field theory, previous
studies have shown that synaptic bombardment noise is
determined largely by the strengths of excitatory and in-
hibitory inputs, the balance between excitation and inhi-
bition, the correlation among input spike trains and the ef-
fective mean arrival rate of spikes [20,49,81]; furthermore,
unreliable synaptic transmission might be an important
mechanism regulating this type of neuronal noise [20]. Ion
channel noise has been found to be modulated by the mem-
brane potential of neurons, the densities of ion channels
and the area of the membrane patch [8–10]. In the real
brain, we postulate that these two types of neuronal noise
may work together and be jointly responsible for evoking
SR at the single-neuron level.
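The mean-field determinants of synaptic bombardment noise listed above can be made concrete with the standard diffusion approximation (Brunel-type), in which the drift and fluctuation of the membrane potential follow from the input strengths, rates, and the excitation-inhibition balance. The sketch below is illustrative; all parameter values are assumptions, not data from the cited studies.

```python
# Diffusion approximation for synaptic bombardment: a neuron with K_E
# excitatory and K_I inhibitory Poisson inputs (rates nu_E, nu_I; efficacies
# J_E, J_I in mV) sees a membrane drift mu and fluctuation sigma:
#   mu    = tau * (K_E * J_E * nu_E - K_I * J_I * nu_I)
#   sigma = sqrt(tau * (K_E * J_E**2 * nu_E + K_I * J_I**2 * nu_I))
import numpy as np

tau = 10.0            # membrane time constant (ms)
K_E, K_I = 800, 200   # numbers of excitatory / inhibitory synapses
J_E, J_I = 0.2, 0.8   # synaptic efficacies (mV); illustrative values
nu_E = nu_I = 0.005   # presynaptic rates (spikes/ms, i.e. 5 Hz)

mu = tau * (K_E * J_E * nu_E - K_I * J_I * nu_I)
sigma = np.sqrt(tau * (K_E * J_E ** 2 * nu_E + K_I * J_I ** 2 * nu_I))
print(mu, sigma)      # drift (mV) and fluctuation (mV)
```

With these values, excitation and inhibition are balanced (mu = 0) while the fluctuation sigma remains finite, illustrating how the balance and rates of synaptic input set the effective noise intensity a neuron experiences.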
Compared with a single neuron, a population of neurons
may cooperate with each other and collectively respond to
a weak external driving signal. At an optimal noise inten-
sity, many studies have shown that the maximal response
of neuronal ensembles to a weak driving signal is relatively
stronger than that at the single-neuron level [44–49, 57].
These observations provide evidence that collective neu-
ronal firing might be more efficient in detecting neuronal
information carried by weak driving signals. However, the
performance of population neuronal SR is not only im-
pacted by the abovementioned intrinsic and external reg-
ulatory factors occurring at the single-neuron level but
can also be significantly modulated by both the network
structure and network parameters (Fig. 4(b)), which are
systematically summarized as follows.
First, intensive statistical analysis of the neuronal con-
nectome at the micro-scale level has revealed that the
wiring diagram of neurons exhibits typical small-world and
scale-free features depending on different conditions [82].
Indeed, there is accumulating computational evidence that
the performance of neuronal SR can be amplified via
fine-tuning of the small-world or scale-free network struc-
tures [41, 83–85]. More importantly, using a spatially em-
bedded network model, an optimal topology between the
heterogeneous scale-free network and the strong random
geometric network has also been determined to optimize
the performance of neuronal SR, revealing that a suitable
Fig. 4: (Color online) Regulation of neuronal SR at different
levels. (a) At the single-neuron level, neuronal SR is mainly
impacted by single neuron firing properties and joint effects of
different types of neuronal noise. (b) At the population level,
neurons respond to external driving signals together, and the
performance of neuronal SR is both influenced by regulatory
factors at the single-neuron level and modulated by the net-
work structure and parameters.
number of hubs, and with it an optimal ratio between
short- and long-range connections, is critical for enhancing
SR in neural systems [86].
Second, neurons communicate with each other through
synapses, and different types of synaptic couplings may
contribute distinct effects to neuronal SR (Fig. 4(b)). The
most common type of synapse in the brain is the chemi-
cal synapse, which can be further divided into excitatory
and inhibitory synapses [1]. By establishing neuronal net-
works coupled by chemical synapses, several studies have
demonstrated that a network with strong excitation or in-
hibition dramatically deteriorates the weak signal detec-
tion capability, and a suitable excitation-inhibition bal-
ance in the network can improve the performance of neu-
ronal SR-related behaviors [57, 87]. Furthermore, elec-
trical synapses have also been demonstrated to strongly
impact the dynamics of neural systems. Using a small-
world network composed of Rulkov map-based neurons,
Perc reported that the performance of neuronal SR in-
duced by a pacemaker highly depends on the electrical
coupling strength, and the small-world network property
is able to enhance the SR only for intermediate levels of
coupling strength [83]. Note that a similar effect of elec-
trical synapses has also been found in FitzHugh-Nagumo
(FHN) neurons connected in an array [88], but such en-
hancement does not seem to be reproduced in a small-world
neuronal network consisting of HH neurons [41]. In that
study, increasing the electrical coupling strength tends to
decrease the maximal neuronal response to weak periodic
driving signals [41]. Interestingly, several studies have also
compared the effects of chemical and electrical synapses
on the enhancement of signal propagation via SR-related
mechanisms in neuronal networks with different struc-
tures [65,84,89]. Most relevant studies have indicated that
excitatory synaptic coupling might be more efficient than
electrical synapses in weak signal detection [65,89]. In ad-
dition to normal “feedforward” synapses, neurons might
also be self-innervated via feedback connections referred
to as autapses [79]. It was recently demonstrated that the
performance of pacemaker-induced SR can be improved in
a scale-free network composed of HH neurons at an opti-
mal autaptic coupling strength [90].
Lastly, spike transmission delay, caused both by the finite
propagation speed and by the time lapse of dendritic and
synaptic integration, is unavoidable in neural
systems [1, 2]. As an intrinsic property of neuronal infor-
mation processing, the transmission delay of spikes may
play a vital role in regulating neuronal SR (Fig. 4(b)). In
particular, several modeling studies have shown that the
spike transmission delay is critical for triggering multiple
SR, and such behavior appears at every integer multiple
of the period of the external driving signal [91, 92]. By
introducing a pacemaker with an autapse, it has also been
reported that multiple SR can be evoked by matched au-
taptic transmission delay in neuronal networks [90]. These
results, however, do not suggest that the transmission de-
lay is a necessary condition for multiple SR in neural sys-
tems. For a sufficiently long period of subthreshold driv-
ing signals, neural systems may exhibit typical multiple
SR even in the absence of spike transmission delay [69].
Additionally, the transmission delay may also modulate
the performance of neuronal SR. Such modulation of neu-
ronal SR shows high sensitivity to stochastic noise [93].
At an optimal noise level, introducing the spike transmis-
sion delay into the network may reduce the capability of
weak signal detection, thus deteriorating the performance
of neuronal SR. When stochastic noise is not at the opti-
mal level, a suitable tuning of the transmission delay can
assist noise in detecting subthreshold driving signals.
Conclusions and open questions. – Random fluc-
tuations in brain activity are attributed to diverse sources
of neuronal noise and have been widely observed in exper-
imental recordings at different spatiotemporal scales [3,4].
At the micro-scale level, neuronal membrane potentials
display voltage fluctuations at millisecond temporal reso-
lution, strongly implying that neurons
may operate in noisy environments. Recent studies have
shown that a suitable level of neuronal noise may not de-
grade but can instead support efficient signal processing in the
brain [4,21–24]. In this context, SR is a leading candidate
mechanism for enhancing the capability of weak signal de-
tection in neural systems. Over the last several decades,
remarkable progress has been made in this field using com-
putational approaches [23, 24]. In this mini review, we
systematically summarized different types of SR-related
behaviors observed in modeling studies and also discussed
several possible regulation mechanisms of neuronal SR in
the brain.
Although neuronal SR has been widely investigated in
recent years, there are still several open questions that
deserve further exploration. First, an increasing number
of modeling studies have documented different types of
SR in neural systems, but it is not established whether
some of them can coexist in the same system. In addi-
tion, it remains unknown whether all types of neuronal
SR discovered in modeling studies can be observed in ex-
periments. Further studies combining both computational
and experimental approaches are needed to examine these
two issues. On the other hand, chemical and electrical
synapses coexist within the mammalian brain [1, 2]. How-
ever, previous modeling studies on neuronal SR have sel-
dom included them together in the same network with an
appropriate ratio matching experimentally reported data.
Accordingly, it is still unclear whether these two types of
synapses play combined and complementary roles in
evoking and regulating neuronal SR, a question
that can be further tested in future studies.
Looking forward, we must be cognizant that SR is
only one possible mechanism by which neurons may ex-
ploit noise to facilitate signal processing in the brain.
Other noise-induced behaviors, such as stochastic syn-
chronization [94], inverse stochastic resonance [95–98],
phase resetting of collective rhythm [99, 100] and neu-
ral avalanches [101], may also play critical roles in high-
efficiency neuronal information processing. A compre-
hensive theoretical framework that incorporates all these
noise-enhanced phenomena will surely be rewarding in fu-
ture modeling studies.
Acknowledgments. – We sincerely thank Dr. Dong-
ping Yang for his kind help in proofreading our
manuscript. This research was supported by the Na-
tional Natural Science Foundation of China (Grant
Nos. 31771149, 61527815, 81571770, 81771925 and
81371636), the Project of Science and Technology Depart-
ment of Sichuan Province (Grant Nos. 2017HH0001 and
2018HH0003), and the Slovenian Research Agency (Grant
Nos. J1-7009, J4-9302, J1-9112 and P1-0403).
[1] Kandel E. R. et al.,Principles of neural science
(McGraw-Hill, New York) 2012.
[2] Gerstner W. and Kistler W. M.,Spiking neuron mod-
els: Single neurons, populations, plasticity (Cambridge
University Press, Cambridge) 2002.
[3] Destexhe A. and Rudolph-Lilith M.,Neuronal noise
(Springer Science & Business Media) 2012.
[4] Stein R. B. et al.,Nat. Rev. Neurosci.,6(2005) 389.
[5] Stevens C. F.,Biophys.,12 (1972) 1028.
[6] Hille B.,Ionic channels of excitable membranes (Sinauer
Associates, Sunderland, MA) 1992.
Functional importance of noise in neuronal information processing
[7] Strassberg A. F. and DeFelice L. J., Neural Comput., 5 (1993) 843.
[8] Schmid G. et al.,EPL,56 (2001) 22.
[9] Linaro D. et al.,PLoS Comput. Biol.,7(2011) e1001102.
[10] Goldwyn J. H. and Shea-Brown E.,PLoS Comput.
Biol.,7(2011) e1002247.
[11] Koch C., Biophysics of computation: Information processing in single neurons (Oxford University Press, New York) 1999.
[12] Smetters D. K. and Zador A.,Curr. Biol.,6(1996)
[13] Branco T. and Staras K.,Nat. Rev. Neurosci.,10
(2009) 373.
[14] Guo D. and Li C.,J. Comput. Neurosci.,30 (2010) 567.
[15] Uzuntarla M. et al., Eur. Phys. J. B, 85 (2012) 282.
[16] Guo D. and Li C.,Cogn. Neurodyn.,6(2012) 75.
[17] Ho N. and Destexhe A.,J. Neurophysiol.,84 (2000)
[18] Brunel N.,J. Comput. Neurosci.,8(2000) 183.
[19] Brunel N. et al.,Phys. Rev. Lett.,86 (2001) 2186.
[20] Guo D. and Li C.,J. Theor. Biol.,308 (2012) 105.
[21] Ma W. J. et al.,Nat. Neurosci.,9(2006) 1432.
[22] Fanelli D. et al.,Phys. Rev. E,96 (2017) 062313.
[23] McDonnell M. D. and Ward L. M., Nat. Rev. Neurosci., 12 (2011) 415.
[24] McDonnell M. D. and Abbott D.,PLoS Comput.
Biol.,5(2009) e1000348.
[25] Benzi R. et al.,Tellus,34 (1982) 10.
[26] Gammaitoni L. et al.,Rev. Mod. Phys.,70 (1998) 223.
[27] McNamara B. and Wiesenfeld K.,Phys. Rev. A,39
(1989) 4854.
[28] Berdichevsky V. and Gitterman M.,EPL,36 (1996)
[29] Gingl Z. et al., EPL, 29 (1995) 191.
[30] Izhikevich E. M., Dynamical systems in neuroscience: The geometry of excitability and bursting (The MIT Press, Cambridge, MA) 2007.
[31] Douglass J. K. et al.,Nature,365 (1993) 337.
[32] Levin J. E. and Miller J. P.,Nature,380 (1996) 165.
[33] Collins J. J. et al.,J. Neurophysiol.,76 (1996) 642.
[34] Russell D. et al.,Nature,402 (1999) 219.
[35] Gluckman B. J. et al., Phys. Rev. Lett., 77 (1996) 4098.
[36] Gluckman B. J. et al.,Chaos,8(1998) 588.
[37] Longtin A.,J. Stat. Phys.,70 (1993) 309.
[38] Lee S. G. and Kim S.,Phys. Rev. E,60 (1999) 826.
[39] Bulsara A. et al.,J. Theor. Biol.,152 (1991) 531.
[40] Hodgkin A. L. and Huxley A. F.,J. Physiol.,117
(1952) 500.
[41] Ozer M. and Uzuntarla M.,Phys. Lett. A,373 (2009)
[42] Ullner E. et al.,Phys. Rev. Lett.,91 (2003) 180601.
[43] Holden L. and Erneux T., SIAM J. Appl. Math., 53 (1993) 1045.
[44] Collins J. J. et al.,Nature,376 (1995) 236.
[45] Collins J. J. et al.,Phys. Rev. E,52 (1995) R3321.
[46] Collins J. J. et al.,Phys. Rev. E,54 (1996) 5575.
[47] Stocks N. G. and Mannella R.,Phys. Rev. E,64
(2001) 030902.
[48] Hoch T. et al.,Phys. Rev. E,68 (2003) 011911.
[49] Durrant S. et al.,Phys. Rev. E,84 (2011) 011923.
[50] Plesser H. E. and Geisel T.,Phys. Rev. E,59 (1999)
[51] Liu F. et al.,Phys. Rev. E,59 (1999) 3453.
[52] Yu Y. et al.,Biol. Cybern.,84 (2001) 227.
[53] Guo D. and Li C.,Phys. Rev. E,79 (2009) 051921.
[54] Chialvo D. R. et al.,Phys. Rev. E,65 (2002) 050902.
[55] Chialvo D. R.,Chaos,13 (2003) 1226.
[56] Balenzuela P. et al.,Contemp. Phys.,53 (2012) 17.
[57] Guo D. et al.,Phys. Rev. E,96 (2017) 022415.
[58] Gai Y. et al.,PLoS Comput. Biol.,6(2010) e1000825.
[59] Schmerl B. A. and McDonnell M. D.,Phys. Rev. E,
88 (2013) 052722.
[60] Longtin A.,Phys. Rev. E,55 (1997) 868.
[61] Pikovsky A. S. and Kurths J.,Phys. Rev. Lett.,78
(1997) 775.
[62] Manjarrez E. et al.,Neurosci. Lett.,326 (2002) 93.
[63] Landa P. S. and McClintock P. V. E.,J. Phys. A:
Math. Gen.,33 (2000) 45.
[64] Yu H. et al.,Chaos,21 (2011) 043101.
[65] Sun J. et al.,Appl. Math. Model.,37 (2013) 6311.
[66] Bragin A. et al.,Hippocampus,9(1999) 137.
[67] Vilar J. M. G. and Rubí J. M., Phys. Rev. Lett., 78 (1997) 2882.
[68] Zeng C. H. et al.,Eur. Phys. J. D,62 (2011) 219.
[69] Li H. et al.,Chaos,28 (2018) 043113.
[70] Pinamonti G. et al.,PLoS ONE,7(2012) e51170.
[71] Mejias J. F. and Torres J. J.,PLoS ONE,6(2011)
[72] Connors B. W. and Regehr W. G.,Curr. Biol.,6
(1996) 1560.
[73] ˚
Arhem P. et al.,Biophys. J.,90 (2006) 4392.
[74] Zibman S. et al.,Neurosci.,189 (2011) 51.
[75] Zeng S. Y. et al.,EPL,101 (2013) 38005.
[76] Cervera J. and Mafé S., EPL, 102 (2013) 68002.
[77] Yu Y. et al.,PLoS Comput. Biol.,8(2012) e1002456.
[78] Prescott S. A. et al.,J. Neurosci.,26 (2006) 9084.
[79] Guo D. et al., EPL, 114 (2016) 30001.
[80] Wang F. et al.,Sci. Signal.,5(2012) ra26.
[81] Kreuz T. et al.,Phys. Rev. Lett.,97 (2006) 238101.
[82] Varshney L. R. et al.,PLoS Comput. Biol.,7(2011)
[83] Perc M.,Phys. Rev. E,76 (2007) 066203.
[84] Yilmaz E. et al.,Physica A,392 (2013) 5735.
[85] Wu D. et al.,EPL,86 (2009) 50002.
[86] Gosak M. et al.,New J. Phys.,13 (2011) 013012.
[87] Wang J. et al.,Int. J. Mod. Phys. B,30 (2016) 1550253.
[88] Kanamaru T. et al.,Phys. Rev. E,64 (2001) 031908.
[89] Li X. et al.,Phys. Rev. E,76 (2007) 041902.
[90] Yilmaz E. et al.,Physica A,444 (2016) 538.
[91] Wang Q. et al.,Chaos,19 (2009) 023112.
[92] Hao Y. et al.,Neurocomputing,74 (2011) 1748.
[93] Sun X. and Liu Z., Nonlinear Dyn., 92 (2018) 1707.
[94] Galan R. F. et al.,J. Neurosci.,26 (2006) 3646.
[95] Tuckwell H. C. et al.,Phys. Rev. E,80 (2009) 031907.
[96] Guo D.,Cogn. Neurodyn.,5(2011) 293.
[97] Uzuntarla M. et al.,PLoS Comput. Biol.,13 (2017)
[98] Yamakou M. E. and Jost J.,EPL,120 (2017) 18002.
[99] Levnajić Z. and Pikovsky A., Phys. Rev. E, 82 (2010)
[100] Levnajić Z. and Pikovsky A., Phys. Rev. Lett., 107 (2011) 034101.
[101] Scarpetta S. and De Candia A.,PLoS ONE,8(2013)