Stochastic Resonance Crossovers in Complex Networks
Giovanni Pinamonti¤, J. Marro, Joaquín J. Torres*
Institute "Carlos I" for Theoretical and Computational Physics, and Department of Electromagnetism and Matter Physics, University of Granada, Granada, Spain
Abstract
Here we numerically study the emergence of stochastic resonance as a mild phenomenon and how it transforms into a striking enhancement of the signal-to-noise ratio at several levels of a disturbing ambient noise. The setting is a cooperative, interacting complex system modelled as an Ising-Hopfield network in which the intensity of mutual interactions or "synapses" varies with time in such a way that it accounts for, e.g., a kind of fatigue reported to occur in the cortex. This induces nonequilibrium phase transitions whose onset is associated with various mechanisms producing two types of resonance. The model thus clarifies the details of the signal transmission and the causes of correlation between noise and signal. We also describe short-time persistent memory states, and conclude on the limited relevance of the network wiring topology. Our results, in qualitative agreement with the observation of excellent transmission of weak signals in the brain when competing with both intrinsic and external noise, are expected to be of wide validity and may have technological application. We also present here a first contact between the model behavior and psychotechnical data.
Citation: Pinamonti G, Marro J, Torres JJ (2012) Stochastic Resonance Crossovers in Complex Networks. PLoS ONE 7(12): e51170. doi:10.1371/journal.pone.0051170
Editor: Jürgen Kurths, Humboldt University, Germany
Received July 24, 2012; Accepted October 30, 2012; Published December 14, 2012
Copyright: © 2012 Pinamonti et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Funding: The work was supported by the following: Andalusian Regional Government "Junta de Andalucía," project number FQM-01505; Spanish Science and Innovation Ministry MICINN-FEDER, project number FIS2009-08451; and Spanish Science and Innovation Ministry MICINN-GREIB, project number GREIB.PT_2011_19. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing Interests: The authors have declared that no competing interests exist.
* E-mail: jtorres@onsager.ugr.es
¤ Current address: Dipartimento di Fisica, Università degli Studi di Trieste, Trieste, Italy
Introduction
Ambient fluctuations that are treated as annoying and often ignored play in fact a fundamental role in nature. For example, they may transmit information notwithstanding their deceptive lack of structure (see, e.g., [1,2]), help set up order at the macroscopic, mesoscopic and even nanoscopic levels despite their apparent order-disturbing effect [3], and optimize propagation by turning the medium into an excitable one [4,5] and by inducing coherence between the environmental noise and the periodic part of the signal, which helps weak inputs to go through without damping. The latter is named stochastic resonance (SR), which is believed to occur in many different instances [6-15], is known to be technologically relevant, e.g., in designing filters and sensory devices and in extracting details about geological media traversed by waves [16,17], and is now established as a genuine and common, perhaps universal phenomenon [18-20].
Deciphering the detailed microscopic mechanisms by which diverse fluctuations play a constructive role in such a varied range of circumstances is puzzling. This goal became even more difficult after the discovery of stochastic multi-resonance (SMR) in human perception [21], in accordance with predictions in assorted contexts, which demands searching for further causes [22-27].
The hallmark of SR is a rise of the power spectral density, or of the input-output correlation, with increasing strength of a noise that competes with the main input signal. Past a peak, as the noise level is further increased, the noise tends again to dominate, so that the signal transmission may be impeded in practice. One speaks of SMR when several peaks of this kind show up at different levels of noise.
In this paper, we report on a numerical study of SR and SMR in the Ising system on a network in which each node is linked to every other node. Such full wiring is not realistic, but this feature is in practice swept away here by assuming inhomogeneous connectivity. That is, the interactions or connections are weighted and time varying, following a pattern which has been observed, for instance, in the central nervous system [28-31]. This transforms in practice the original regular net into an effective complex network whose links happen to play an essential role, as described in detail, for example, in [31] and references therein. The ambient noise is modelled in our case by the standard thermal bath, and an external deterministic, time-periodic signal is added to the current arriving at each unit. Using this simple setting, in which one may think of units and connections as oversimplified neurons and synapses, respectively, we describe a crossover from SR to SMR obtained by changing the dynamic properties of the synapses. Important features of SMR phenomena are then tuned by simply modifying model parameters that have a well-defined physical meaning. Our study thus deepens the understanding of the microscopic basis, and therefore of the detailed nature, of SMR as it may occur in an ample family of complex, cooperative or interacting systems, and we relate SMR to nonequilibrium phase transitions that are known to bear relevance to the understanding of some brain functions [32,33].
Methods
Let there be $N$ binary neurons, namely, $s_i = 0$ or $1$, $i = 1,\dots,N$, each linked to the rest by synapses $i \neq j = 1,\dots,N$, whose intensities or weights are given by the covariance rule [34]:

$$\omega_{ij} = \frac{1}{N p(1-p)} \sum_{\mu} \left(\xi^{\mu}_{i} - p\right)\left(\xi^{\mu}_{j} - p\right), \qquad \omega_{ii} = 0. \qquad (1)$$
This rule, which modifies the familiar Hebbian prescription to avoid saturation of weights, as if there were a threshold, involves $P$ patterns, namely, $\xi^{\mu}_{i} \in \{0,1\}$ with $\mu = 1,\dots,P$, that are assumed to have been previously "learned" by the system. The parameter $p$ in (1) measures the excess of 1's over 0's, or the symmetry, in the mean net activity of the set of patterns, namely, $p = \langle \xi^{\mu}_{i} \rangle_{i,\mu}$. In practice, for simplicity and also to avoid specificities concerning this model feature, we deal here with random patterns in the sense that each $\xi^{\mu}_{i}$ is set to either 0 or 1 at random, with the only restriction that $\langle \xi^{\mu}_{i} \rangle_{i,\mu}$ equals the given value of $p$.
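For concreteness, the following minimal Python sketch (ours, not part of the original article; function names are hypothetical) generates random binary patterns with mean activity close to $p$ and builds the covariance weights of Eq. (1):

```python
import numpy as np

def make_patterns(N, P, p, rng):
    # Random binary patterns xi[mu, i] in {0, 1}; their mean activity is only
    # approximately p here, whereas the article fixes <xi>_{i,mu} = p exactly.
    return (rng.random((P, N)) < p).astype(float)

def covariance_weights(xi, p):
    # Covariance rule, Eq. (1): w_ij = (1 / (N p (1-p))) sum_mu (xi_i - p)(xi_j - p), w_ii = 0.
    P, N = xi.shape
    centered = xi - p
    w = centered.T @ centered / (N * p * (1.0 - p))
    np.fill_diagonal(w, 0.0)
    return w

rng = np.random.default_rng(0)
xi = make_patterns(N=1000, P=1, p=0.5, rng=rng)
w = covariance_weights(xi, p=0.5)
```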
Evolution with time is by parallel, cellular-automata dynamics, namely, by stochastic changes at each time step of the whole set $\mathbf{s} = \{s_i\}$ according to the probabilities:

$$P_i\{s_i(t+1) = s\} = \frac{1}{2} + \left(s - \frac{1}{2}\right)\tanh\!\left[I_i(t)\,T^{-1}\right], \qquad \forall i. \qquad (2)$$
Here, $s$ equals either 1 or 0, $T$ is the temperature of the underlying bath, and

$$I_i(t) = 2\left[h_i(t) - \theta_i + A(t)\right] \qquad (3)$$

stands for the total input on each neuron. The last term in this equation is an external signal that we shall first assume to be $A(t) = A_0\cos(ft)$ (see, however, the section "Changing the signal" below), where the amplitude $A_0$ will in practice be small compared to the total input, and the $\theta_i$ are thresholds for firing, which we take here equal to half the sum of the weights of all the synapses connecting $i$ to the other neurons, $\theta_i = \frac{1}{2}\sum_{j=1}^{N}\omega_{ij}$. The first term on the rhs of Eq. (3) is the net current arriving on neuron $i$ from the others, which is given by

$$h_i(t) = \sum_{j=1}^{N} \omega_{ij}\, x_j(t)\, s_j(t). \qquad (4)$$
Here we modulate the synaptic weights with the variable $x_i(t)$, which we shall assume to change with time according to the map [28]:

$$x_i(t+1) = x_i(t) + \frac{1 - x_i(t)}{a} - b\, x_i(t)\, s_i(t). \qquad (5)$$

This ansatz could be replaced by direct assumptions on the net links that have an easy interpretation on physical grounds, see e.g. [31], without affecting our main results here. Nevertheless, the choice (5) is simpler and has been previously tested in neuroscience studies [35]. It amounts to assuming a sawtooth-shaped change in time, with $a$ and $b$ measuring the teeth width and depth, respectively, and it describes a competition of effects associated with synaptic "fatigue". That is, the link of intensity $\omega_{ij} x_j$ is weakened as $b$ is increased, while decreasing $a$ makes $x$ recover its maximum value more rapidly. The link weight effectively remains constant in practice if such a recovery becomes very fast, so that one sometimes speaks of "$a = 0$" as the limit of static synapses, which characterizes the standard Ising and Hopfield cases [36,37]. The choice (5) originates in differential equations that try to account for the fact that electrical stimulation due to local, and even spatially extended, activity may induce short-term plasticity leading to depression, and sometimes also facilitation, of synaptic transmission [35,38].
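A minimal sketch of one parallel Monte Carlo step implementing Eqs. (2)-(5) may look as follows (again our own illustration, with hypothetical names; the weights w and thresholds theta are those defined above, and T is the bath temperature):

```python
import numpy as np

def parallel_update(s, x, w, theta, T, A_t, a, b, rng):
    # s: binary states (N,); x: synaptic resources (N,); w: weights (N, N);
    # theta: firing thresholds (N,); A_t: external signal at this step; a, b: Eq. (5) parameters.
    h = w @ (x * s)                            # local fields, Eq. (4)
    I = 2.0 * (h - theta + A_t)                # total input, Eq. (3)
    p_fire = 0.5 * (1.0 + np.tanh(I / T))      # P{s_i(t+1) = 1}, from Eq. (2)
    s_new = (rng.random(s.shape) < p_fire).astype(float)
    x_new = x + (1.0 - x) / a - b * x * s      # synaptic "fatigue" map, Eq. (5)
    return s_new, x_new
```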
The relevant order in this system may be described by monitoring the firing rate, i.e., $m(t) = \frac{1}{N}\sum_i s_i(t)$, which is in fact sometimes recorded in laboratory experiments. Though hardly accessible experimentally, the overlap of the actual state with each pattern $\mu$ is also interesting in order to illustrate the system behavior in detail; it is defined as

$$m^{\mu}(t) = \frac{1}{N p(1-p)} \sum_{i=1}^{N} \left(\xi^{\mu}_{i} - p\right) s_i(t). \qquad (6)$$
Furthermore, we are interested in measuring the intensity of the input-output correlation, so that we shall compute the function

$$C_f = \lim_{t\to\infty} \frac{1}{t}\int_{0}^{t} m(t')\, \exp(i f t')\, dt', \qquad (7)$$

i.e., the Fourier coefficient at frequency $f$ of the output firing rate. The relevant correlation, to be denoted $C(T)$ in the following, is signal dependent; e.g., in the cosine case we define it as the value of $C(f,T) \equiv |C_f|^2 / A_0^2$ computed at the frequency of the input signal.
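In a discrete-time simulation, Eq. (7) and the normalized correlation $C(T)$ can be estimated from the recorded firing-rate series; a possible sketch (our own, hypothetical naming) is:

```python
import numpy as np

def input_output_correlation(m_series, f, A0):
    # Discrete estimate of C_f (Eq. 7) from the firing-rate time series m(t),
    # returning C(T) = |C_f|^2 / A0^2 at the input frequency f.
    t = np.arange(len(m_series))
    C_f = np.mean(m_series * np.exp(1j * f * t))
    return np.abs(C_f) ** 2 / A0 ** 2
```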
The phase diagram of the above model with $A(t) = 0\ \forall t$ was examined before [28,29,31,32,39]. The most detailed study so far concerns the case in which $x$ in (4) is interpreted as a stochastic variable with a distribution inspired by (5) [31]. A main result in this case, which does not differ essentially from the present one, is its relevance to a better understanding of cooperative phenomena in several fields. In particular, by tuning parameter values properly, the model exhibits familiar equilibrium phases, namely: a disordered high-$T$ phase (corresponding to the paramagnetic phase in condensed matter) in which the stationary values of all the overlaps are practically zero; a low-$T$ phase with conventional order (corresponding to ferromagnetism) in which the global activity converges with time towards one of the attractors $\{\xi^{\mu}_{i}\}$, so that it is often taken as a model example of associative memory; and a, say, spin-glass phase in which convergence is towards a mixture of stored patterns.
Figure 1. The signal-to-noise function $C(T)$ depicts in this semilogarithmic plot a shallow resonance for static synapses at the critical temperature. (Here, $A_0 = 0.005$, $f = 0.04$, and $p = 0.5$.)
doi:10.1371/journal.pone.0051170.g001
In addition, the system may be tuned to exhibit nonequilibrium phases [36]. Namely: (i) one in which there is a rapid and rather irregular roaming among the attractors, thus closely mimicking, for example, long-time structural changes and oscillations that have been associated with reaction-diffusion phenomena in physics and chemistry, as well as efficient, say, states of attention that are of interest in neuroscience; (ii) one which is mainly characterized by oscillations between one of the stored patterns and its negative or corresponding antipattern; and (iii) one with quite irregular, apparently chaotic roaming randomly interrupted by pattern-antipattern oscillations [31]. The case (5) induces similar though relatively simpler behavior, e.g., the most involved behavior (iii) does not seem to fully develop in this case.
Results
From single to multiple resonance
We report here on Monte Carlo simulations of the above model. Exploratory runs showed no essential influence of $N$ or $P$ on the main behavior of interest, so that we shall report first on the sufficiently large, typical case $N = 1000$, and will focus on $P = 1$, i.e., the only dynamic attractors are a given pattern and its antipattern. Varying $N$ and $P$ is also interesting, however, and we shall be concerned with this later. The stored pattern will initially correspond to $p = 0.5$, which means the same number of firing and silent neurons on average, but changing $p$ will be shown later on to importantly modify the system behavior. Time series for performing averages consisted of $10^5$ Monte Carlo steps.
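As an illustration of how the pieces above fit together, a hypothetical driver for one run (using the sketches introduced in the Methods section; parameter values are illustrative ones quoted in the text, and the script is ours, not the authors' code) could read:

```python
import numpy as np

# Illustrative parameter values taken from the text (e.g., Fig. 1 uses static synapses instead).
N, P, p = 1000, 1, 0.5
A0, f, a, b, T = 0.005, 0.04, 80.0, 0.5, 1.0
steps = 100_000

rng = np.random.default_rng(1)
xi = make_patterns(N, P, p, rng)
w = covariance_weights(xi, p)
theta = 0.5 * w.sum(axis=1)              # thresholds theta_i = (1/2) sum_j w_ij, as in Eq. (3)

s = rng.integers(0, 2, N).astype(float)  # random initial state
x = np.ones(N)                           # fully recovered synaptic resources
m_series = np.empty(steps)
for t in range(steps):
    A_t = A0 * np.cos(f * t)             # weak sinusoidal signal
    s, x = parallel_update(s, x, w, theta, T, A_t, a, b, rng)
    m_series[t] = s.mean()               # firing rate m(t)

print(input_output_correlation(m_series, f, A0))
```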
In the Hopfield limit of static synapses, $x(t) = 1\ \forall t$, the system exhibits a rather weak resonance. As shown in Fig. 1, a well-defined though shallow peak in the input-output correlation occurs around $T = T_C = 1$.
Figure 2. Three sets, at different noise levels or temperatures $T$ as indicated, each with two time series for, respectively, the firing rate (top of each set) and the overlap (bottom of each set), showing a tendency towards coherence at $T_C = 1$. The common external signal $A(t)$ and time scale are shown at the bottom below the sets. (Same case as in Fig. 1, except that $A_0 = 0.01$.)
doi:10.1371/journal.pone.0051170.g002
This is the bath temperature separating the ferromagnetic phase, for $T < T_C$, from the disordered phase, for $T > T_C$. The mechanism behind this behavior is illustrated in Fig. 2, which exhibits typical time series corresponding to the two relevant equilibrium phases. Namely, one is characterized by a non-zero overlap (in fact, close to its maximum in our example, shown as the second graph of the top set for $T = 0.4$) and the other by zero overlap, i.e., small-amplitude fluctuations around zero as in the bottom set. This figure also exhibits a near-critical condition (middle set) in which the overlap shows larger-amplitude fluctuations. It is remarkable that only in the latter case, with $T \approx T_C$, is the firing rate clearly coupled to the cosine within $A(t)$; the overlap also happens to be somewhat coupled to the signal here, but this is not obvious to the naked eye in Fig. 2. The familiar critical bistability resulting from a competition between thermal fluctuations and (static though non-homogeneous) node interactions is in this case the mechanism [18,19] that allows the weak signal to prevail despite the noise.
More involved behavior shows up when synapses are dynamic, namely, when $x$ in (4) varies with time as stated in (5). As a matter of fact,
one may then expect changes in the transmission of signals, given
the very different development of order which occurs depending
on the parameter values in this case, as we described at the end of
the previous section.
Fig. 3 illustrates the case as one modifies the depression parameter $a$ in (5). The SR maximum is still clearly depicted for any $a$, but it now corresponds to the transition between the equilibrium disordered phase and the nonequilibrium one characterized by (possibly irregular) oscillations of the global activity, that is, the phase identified as (ii) above. Furthermore, two other main differences arise. One is that the peak location moves towards lower temperature as $a$ increases, in agreement with a reported scaling of the critical temperature with synaptic depression [28]. Furthermore, there is a factor of nearly $10^3$ in the vertical scale here as compared to the one in Fig. 1, namely, the resonance effect is now much stronger, even though the signal for this figure is weaker than in the previous simulation for static synapses.
Actually more intriguing is some indication of SMR for dynamic synapses, i.e., $C(T)$ tends to form, and sometimes develops, a plateau at low temperature which seems to announce a second resonance peak of a different origin that will finally show up for $p \neq 0.5$. The tendency is not fully materialized here, however, due to our restriction so far to strictly symmetric patterns ($p = 0.5$), which induces some symmetry of the connection intensities, as we discuss next.
Effects of asymmetry
The fact that the incipient correlation plateaus in Fig. 3 are associated with the mechanisms inducing transitions between the equilibrium-memory and nonequilibrium-oscillatory phases is confirmed by analysis of the corresponding time series (not shown). That is, one observes that the overlap then describes rapid oscillations between the stored pattern and its antipattern that are definitely correlated with the waving of the signal. Closer inspection does not reveal any such correlations in the firing-rate series, however. Consequently, the function $C(T)$, which derives from $m(t)$, shows no definite peak. This apparent inconsistency arises because, as long as one considers $p = 0.5$, the firing rate, unlike the overlap, fluctuates with only small amplitude, around $m = 0.5$ in practice. It follows that analyzing $p \neq 0.5$ is now needed, especially after one notices that the asymmetric case is in fact the only one bearing interest for hypothetical realizations of this resonance phenomenology in the laboratory.
Figs. 4 and 5 illustrate the change of behavior as the mean neuron activity in the pattern, $p$, is modified. The first one shows that any asymmetry in the number of firing and silent neurons induces SMR, namely, a sharp peak (together with some "harmonics") at very low $T$, near the transition between the memory and oscillatory phases, and a cleaner and somewhat less pronounced peak at higher $T$, near the transition between the oscillatory and disordered phases. Interestingly enough, the resonance is enhanced with increasing asymmetry. We also notice that, as expected, the underlying pattern-antipattern symmetry induces the same behavior for $p > 0.5$ as for $p < 0.5$.
Fig. 5 clearly depicts the nature of the low-temperature resonance peak and how it is associated with asymmetry. That is, the oscillations of the firing rate are essentially different for the two cases of correlated behavior. One observes at $T = 0.045$ a behavior that resembles the one for the middle set in Fig. 2. This is a critical condition, corresponding to a second-order phase transition, in which the resonance is essentially induced by noise and long-ranged correlations. There are oscillations of both $m(t)$ and $m^{\mu}(t)$ that are definitely correlated with those of $A(t)$ (which results in the high-$T$ resonance peak), but they occur between states that, due to the underlying noise, are not strongly correlated with the information content, as one should expect given that the jumping now occurs practically between the stored pattern and a disordered state.
Figure 3. Different resonance curves $C(T)$ as one modifies the value of $a$ in (5), as indicated, for $A_0 = 0.001$, $f = 0.04$ and $b = 0.5$.
doi:10.1371/journal.pone.0051170.g003
Figure 4. Resonance curves when one introduces an essential asymmetry by varying the mean neuron activity in the stored pattern, $p$, as indicated. (Here, $A_0 = 0.001$, $f = 0.04$, $b = 0.5$ and $a = 80$.)
doi:10.1371/journal.pone.0051170.g004
Perhaps the most striking observation here is that $m^{\mu}(t)$ subtly correlates with the signal; namely, the correlation occurs as a modulation in the amplitude of the pattern-antipattern oscillations (see the middle panel of the bottom-left set in Fig. 5). Also interesting is that, in spite of the noise in this case, the weak signal is able to correlate with the neurons' activity, therefore affecting the processing of information at very short time scales, as discussed further in the next section.
The relevant mechanism happens to be qualitatively different near the low-$T$ resonance peak, e.g., $T = 0.0076$ in Fig. 5. Both the firing rate and the overlap now show abrupt oscillations with precisely the same frequency and strongly correlated with $A(t)$. In particular, the low (high) firing metastable states corresponding to high (low) overlap, i.e., transitions between the only two possible levels of neural activity in the (normal) case of asymmetric patterns, are synchronized to the maxima (minima) of the cosine signal. As in a first-order phase transition, and unlike the high-$T$ case, such a strong correlation tends to diminish sharply as $T$ is either increased or decreased even slightly, as Fig. 5 reveals. Furthermore, neither of the time series, $m(t)$ and $m^{\mu}(t)$, displays superimposed fluctuations, confirming that the noise, even though necessary, is not the relevant cause here. The control is now in the weak signal, and the global activity remains correlated with the information content during a relatively long time, namely, at least of the order of the signal period.
Figure 5. Time series for the firing rate (top graph of each set) and for the overlap (bottom graph of each set) at different temperatures, as indicated, in the asymmetric case $p = 0.45$. (Other parameters as in Fig. 4.) The second set from top in the right column corresponds to the low-$T$ peak; the bottom set in the left column corresponds to the high-$T$ peak. The common external signal $A(t)$ and time scale are shown at the bottom below the sets.
doi:10.1371/journal.pone.0051170.g005
Fig. 6 illustrates the situation for $p \neq 0.5$ as one changes $a$. On the one hand, the behavior happens to be similar to the one for SR observed above in the symmetric case (cf. Fig. 3), namely, increasing (decreasing) $a$ shifts the peaks to lower (higher) $T$ and, at the same time, the height of the peaks increases (decreases). On the other hand, the two peaks tend to merge into a single one as $a$ is decreased, and the low-$T$ peak does not really show up in practice for any $a < 10$. A main conclusion is therefore that SMR requires both asymmetry of the patterns concerning $p$ in (1), which is in fact a general property of nature, and large enough values of the parameter $a$ characterizing the synaptic changes in (5), i.e., a complex functionality of connections, even though the actual wiring may be a simple, fully-connected one.
Changing the signal
The above suggests that the details of the input signal may also have an effect on the resonance. Indeed, Fig. 7 reveals a substantial influence of the amplitude $A_0$, and confirms the different nature of the two peaks. While the high-$T$ peak remains constant, the low-$T$ peak strongly varies with $A_0$ for $p \neq 0.5$. This is due to the normalization of $C(T)$ with respect to $A_0$. That is, since the oscillations that correspond to the first peak are fixed in amplitude (the system is switching between pattern and antipattern), the normalization factor leads to an inverse dependence between the peak height and the signal amplitude. This is not the case for the second resonance peak, because the amplitude of the oscillations in the firing rate also increases with $A_0$; this peak of $C$ thus remains constant, maintaining its shape and height independently of the value of $A_0$. Such differences are a consequence of what we observed above in relation to Fig. 5. That is, the behavior around $T = 0.045$ is determined more by the signal (and, therefore, by $A_0$) than by the well to be overcome at the transition point, while the well depth dominates over the signal influence around the (first-order) transition at $T = 0.0076$.
We also checked the robustness of the behavior with respect to the nature of the signal. Let us consider a familiar case, namely a non-homogeneous Poissonian spike train with an instantaneous firing rate modulated by a slow sinusoidal function. That is, instead of a cosine, we shall now use in Eq. (3) the signal $A(t) = A_0\sum_{i=1}^{k}\delta(t - t_i)$, where the occurrence times $t_i$ are generated from a non-homogeneous Poisson process of mean rate $\lambda(t) = \lambda_0\left[1 + a\cos(ft)\right]$, i.e., varying with time. This is believed to be more realistic than a sine or a cosine, at least for neural systems; e.g., it is sometimes assumed to represent the spike activity of a neuron in sensory areas processing structured external signals from the senses. This choice is also a more general function, which eliminates specific features of the cosine and includes both stochasticity (inherent here to the Poisson process) and some quasi-periodic structure codifying relevant information, which is important for the phenomena involved.
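A simple way to produce such an input on the discrete Monte Carlo time grid is to draw, at each step, a spike of amplitude $A_0$ with probability $\lambda(t)$; the following sketch (our own discretization, not necessarily the one used by the authors) illustrates this with the parameter values quoted for Fig. 8:

```python
import numpy as np

def poisson_signal(steps, A0, lam0, mod_a, f, rng):
    # Non-homogeneous Poissonian train, one Bernoulli trial per Monte Carlo step:
    # a spike of amplitude A0 occurs with probability lam(t) = lam0 * (1 + mod_a * cos(f t)).
    t = np.arange(steps)
    lam = lam0 * (1.0 + mod_a * np.cos(f * t))
    return A0 * (rng.random(steps) < lam).astype(float)

rng = np.random.default_rng(2)
A = poisson_signal(steps=100_000, A0=0.005, lam0=0.05, mod_a=0.75, f=0.04, rng=rng)
```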
A first observation is that, as Fig. 8 illustrates, no essential qualitative changes occur using one or the other signal in a typical case of SMR. On the other hand, inspection of time series such as those in Fig. 9 again shows indications of the different nature of the two peaks. At low $T$, e.g., $T = 0.007$ in this figure, the firing rate switches from low to high mean activity each time a train or burst of inputs arrives. Once the stimulus ends or the arriving signals become sparse, the system stays in the metastable state of high activity (as occurs in Fig. 5 for the cosine maxima) until the synapses depress, due to such a stay at high activity, and the metastable state destabilizes. It seems sensible to link this behavior with that in a hypothetical working-memory context in which the activity persists for some time after the stimulus has ceased. As a matter of fact, a sort of short-term synaptic plasticity reminiscent of this situation has already been proposed [40,41]. On the contrary, the system processes without sluggishness at high $T$, e.g., $T = 0.045$ in Fig. 9. That is, a single spike input induces switching from low to high activity, and the high-activity state persists only for the duration of the stimulus, so that any temporal structure encoded in the signal is precisely processed at the high-$T$ resonance.
Discussion
We have studied here the origin of stochastic resonance as it occurs in a biologically motivated Ising-Hopfield model system with thresholded neurons and dynamic synapses. This results in an interacting complex network, namely, one in which the intensity of connections is inhomogeneously distributed and varies with time, which essentially influences functionality. For a wide range of parameter values, the system shows intense resonance at different levels of noise. More specifically, as the noise is increased in the case $P = 1$, i.e., when the system stores a single pattern, the network activity passes from a resting state with some activity around this pattern to a phase in which this situation destabilizes and the global activity oscillates between the metastable states corresponding to the pattern and its antipattern configurations.
Figure 6. Resonance curves for varying $a$, as indicated, when $p = 0.45$ and $b = 0.5$, for a sinusoidal signal with $A_0 = 0.001$ and $f_0 = 0.04$.
doi:10.1371/journal.pone.0051170.g006
Figure 7. Effect of varying the amplitude $A_0$ for $p = 0.45$. The inset shows the dependence on $A_0$ of the amplitude of the oscillations of $m(t)$ for each of the two peaks.
doi:10.1371/journal.pone.0051170.g007
When the noise increases even more, the pattern-antipattern oscillations wash out and a disordered phase emerges. Interestingly enough, SMR happens to require in this setting both some synaptic depression, so that the relevant phases occur, and an asymmetric stored pattern, as is always the case in practice. Two resonance peaks, namely, sudden increases of the efficiency in transmitting a weak signal at two different levels of the environmental noise, are then exhibited that are associated with the transition points between the phases.
The natures of the two peaks differ importantly from each other. The low-noise one is mainly due to the coupling between the frequency of the pattern-antipattern oscillations (associated with the occurrence of nonequilibrium phases) and the waving of the input signal. The high-noise peak, however, ensues when a modulation of the amplitude of these oscillations (and not the pattern-antipattern oscillations themselves) correlates with the signal. This relevant modulation clearly manifests itself as a noisy slow oscillation in the firing rate, as illustrated by the inset of Fig. 7, which shows how the amplitude of the firing-rate oscillations increases with the amplitude of the signal.
Figure 9. Time series for different values of $T$, as indicated, corresponding to the SMR curve in Fig. 8 for the Poissonian input train (shown below each set with the time scale). The resonances occur in this case around $T = 0.007$ (second set in the left column) and $T = 0.045$ (third set in the left column).
doi:10.1371/journal.pone.0051170.g009
Figure 8. Resonance curves for a sinusoidal signal and for a non-homogeneous Poissonian input train (in this case, $C(T)$ stands for $|C_f|^2/(A_0^2\lambda_0^2)$ at the modulation frequency $f$ of the non-homogeneous Poissonian process rate). Here, $p = 0.45$, $a = 80$, $b = 0.5$, $f = 0.04$, and $A_0 = 0.001$ for the sine, and $A_0 = 0.005$, $\lambda_0 = 0.05$, and $a = 0.75$ for the Poissonian signal.
doi:10.1371/journal.pone.0051170.g008
The peaks differ not only in their birth mechanism but also in the way the signal is processed. This is made evident when an inhomogeneous Poissonian spike train of small amplitude is used as the input signal. Around the low-noise peak, the system activity tends rather to follow the signal every time a burst of spikes arrives, and it remains excited for a time, which is short but longer than the stimulus duration, until the synaptic-fatigue mechanism destabilizes such metastability. This is precisely the basic microscopic origin of peculiar properties reported to occur in nature, such as undamped propagation in excitable media [4,32,33], and it may also be interpreted as a sort of short-term memory mechanism able to maintain information for, say, a few seconds as in the so-called sensory and working memories. The situation essentially changes around the high-noise peak, where the system detects each single input spike, that is, the finest time structure of the underlying signal.
We also checked how SMR is affected by varying the number $P$ of stored patterns. This is interesting for completeness but also because the global activity becomes even more complex for $P > 1$. That is, the system then tends to keep visiting all the stored patterns and their antipatterns, and it may do this by following quite irregular, even chaotic paths [31]. As Fig. 10 shows, when $P$ is increased for a fixed frequency $f$ of the input signal (left panel), the high-noise resonance slightly increases and moves a little towards lower $T$, while the low-$T$ peak markedly decreases and also moves to lower $T$. This is due to the fact that increasing $P$ tends to increase the frequency of the pattern-antipattern oscillations of the firing rate and, therefore, to decorrelate the firing rate from the input signal. This is as expected, because the memory capacity of the standard Ising-Hopfield model is known to generally decrease due to interference among the stored patterns [42]. For a given value of $P$, on the other hand, the height of the low-noise peak increases with the frequency $f$ of the signal as this approaches the frequency of the pattern-antipattern oscillations (right graph in Fig. 10). The net result is therefore that SMR is robust for a range of $P$ values as long as input signals are of high frequency, while one should expect low-frequency signals to be poorly processed.
A picture similar to the one in Fig. 1 was reported before in settings that are close to ours here but involve serious restrictive conditions [43-45]. In particular, a recent study within the linear and mean-field approximations of the Ising model with constant and homogeneous ferromagnetic interactions under an oscillating magnetic field [44,45] describes resonance behavior when the wiring of connections is not homogeneous. The outcome happens to depend crucially on specific properties of the involved network structure, and the resonance resembles the one in Fig. 1 when the degree distribution obeys a power law $\sim k^{-\gamma}$ with $\gamma > 3$. In spite of its interest for other purposes [46-48], the relevance of the Ising model on scale-free networks is perhaps questionable within the present context. That is, large values of $\gamma$ are generally not observed in nature, and the system is physically anomalous due to finite-size effects for $2 < \gamma < 3$ [44-48]. On the contrary, it is remarkable that in our model, even though the wiring defines a situation in which all neurons are in principle connected to each other, the intensity of connections is not homogeneous and constantly varies with time.
Figure 12. The experimental data (symbols with the corresponding error bars) reported in [54] are plotted here against our theoretical prediction (red solid line) corresponding to the case $p = 0.45$ in Fig. 4. To obtain this fit, the experimental data $C$ with arbitrary units are multiplied by a factor 180, and the external noise amplitude $N$ (which is given in dB) needed to be transformed into our intrinsic noise parameter $T$ using the nonlinear relationship $T = 10^{-4}\left\{T_0 + g N^2\left[1 + \mathrm{erf}\!\left((N - N_0)/(\sqrt{2}\,\sigma_N)\right)\right]\right\}$ with $T_0 = 5$, $g = 7.7$, $N_0 = 50$ dB, and $\sigma_N = 26.19$ dB.
doi:10.1371/journal.pone.0051170.g012
Figure 10. Left: Resonance curves for $f = 0.04$ as the number $P$ of stored patterns is varied, suggesting that the low-$T$ resonance tends to disappear with increasing $P$. Right: Resonance curves for $P = 5$ as one varies the signal frequency $f$, showing the contrary effect, i.e., the low-$T$ resonance intensity increases with $f$. (Here, $p = 0.45$, $A_0 = 0.001$, $a = 80$, and $b = 0.5$.)
doi:10.1371/journal.pone.0051170.g010
Figure 11. Effect of the network size $N$ on SMR. The inset shows how the value of $T$ locating the low (circles) and high (squares) noise peaks depends on $N$. This is for a sinusoidal signal with $A_0 = 0.001$ and $f_0 = 0.04$, and $p = 0.45$, $a = 80$ and $b = 0.5$.
doi:10.1371/journal.pone.0051170.g011
This in fact induces a truly complex functionality of the network, which is likely to correspond more closely to the one found in nature [49-53].
Fig. 11, on the other hand, shows that the results in this paper do not depend essentially on the network size $N$. That is, SMR occurs qualitatively in the same way for a range of sizes, and the value of the noise at which the peaks develop depends on $N$ but soon tends to saturate at a constant value. This is interesting because the neural systems that we attempt to describe are far from being infinite in the thermodynamic sense, but rather correspond to relatively small values of $N$.
Finally, we comment on possible experimental realizations of SMR. Some limited data from a psychotechnical experiment [54-56] concerning the human cortex were recently interpreted in the light of SMR using a simple model consisting of FitzHugh-Nagumo neurons [57-60], which accounts for adaptive thresholds and fatigue-enduring synapses [21]. This in fact motivated the present study of a similar situation in a complex network. We therefore attempted a new contact between those experimental data and the present model; Fig. 12 shows the result, which is encouraging. Further experiments trying to confirm SMR, which would clarify the possible existence of the intriguing mechanisms suggested by the model in this paper, will no doubt be most welcome.
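For reference, the noise-to-temperature mapping used for the fit in Fig. 12 can be evaluated directly; the snippet below is ours and relies on the bracket grouping as reconstructed in the caption of Fig. 12, so it should be read as a sketch of the transformation from the external noise amplitude $N$ in dB to the intrinsic noise parameter $T$:

```python
import math

def noise_to_temperature(N_dB, T0=5.0, g=7.7, N0=50.0, sigma_N=26.19):
    # T = 1e-4 * { T0 + g N^2 [ 1 + erf((N - N0) / (sqrt(2) sigma_N)) ] }, as in the Fig. 12 caption.
    return 1e-4 * (T0 + g * N_dB**2 * (1.0 + math.erf((N_dB - N0) / (math.sqrt(2.0) * sigma_N))))

print(noise_to_temperature(60.0))  # example: intrinsic T corresponding to a 60 dB noise level
```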
Author Contributions
Conceived and designed the experiments: JJT JM GP. Performed the
experiments: GP. Analyzed the data: GP JJT JM. Wrote the paper: JM JJT
GP.
References
1. Golyandina N, Nekrutkin V, Zhigljavsky A (2001) Analysis of Time Series
Structure: SSA and Related Techniques. CRC Press.
2. Dimova II, Kolma PN, Maclina L, Shibera DYC (2012) Hidden noise structure and random matrix models of stock correlations. Quantitative Finance 12: 567-572.
3. Sagués F, Sancho JM, García-Ojalvo J (2007) Spatiotemporal order out of noise. Rev Mod Phys 79: 829-882. See also, for instance, "René Descartes on snowflakes", supplemental material for Furukawa Y, Wettlaufer JS (2007) Snow and ice crystals. Physics Today 60: 70-71.
4. Jung P, Mayer-Kress G (1995) Spatiotemporal stochastic resonance in excitable
media. Phys Rev Lett 74: 2130–2133.
5. Lindner B, García-Ojalvo J, Neiman A, Schimansky-Geier L (2004) Effects of noise in excitable systems. Physics Reports 392: 321-424.
6. Benzi R, Sutera A, Vulpiani A (1981) The mechanism of stochastic resonance.
J of Phys A: Math and Gen 14: L453.
7. Wiesenfeld K, Moss F (1995) Stochastic resonance and the benefits of noise:
from ice ages to crayfish and squids. Nature 373: 33–36.
8. Anishchenko VS, Neiman AB, Moss F, Schimansky-Geier L (1999) Stochastic resonance: noise-enhanced order. Physics-Uspekhi 42: 7-36.
9. Krawiecki A, Holyst JA (2003) Stochastic resonance as a model for financial
market crashes and bubbles. Physica A 317: 597–608.
10. Munakata T, Sato AH, Hada T (2005) Stochastic resonance in a simple
threshold system from a static mutual information point of view. J Phys Soc
Japan 74: 2094–2098.
11. Sato AH (2006) Frequency analysis of tick quotes on foreign currency markets and the double-threshold agent model. Physica A 369: 753-764.
12. McDonnell MD, Stocks NG, Pearce CEM, Abbott D (2008) Stochastic Resonance: From Suprathreshold Stochastic Resonance to Stochastic Signal Quantisation. Cambridge University Press.
13. Special issue ‘‘Stochastic resonance’’ (2009) Eur Phys J B 69:1.
14. Ghosh PK, Marchesoni F, Savel’ev SE, Nori F (2010) Geometric stochastic
resonance. Phys Rev Lett 104: 020601.
15. Tuckwell HC, Jost J (2012) Analysis of inverse stochastic resonance and the long-term firing of Hodgkin-Huxley neurons with Gaussian noise. Submitted, arXiv:1202.249.
16. Weaver RL, Lobkis OI (2001) Ultrasonics without a source: Thermal fluctuation correlations at MHz frequencies. Phys Rev Lett 87: 134301.
17. Snieder R, Wapenaar K (2010) Imaging with ambient noise. Physics Today 63:
44–49.
18. McNamara B, Wiesenfeld K (1989) Theory of stochastic resonance. Phys Rev A
39: 4854–4869.
19. Gammaitoni L, Marchesoni F, Menichella-Saetta E, Santucci S (1989)
Stochastic resonance in bistable systems. Phys Rev Lett 62: 349–352.
20. Fulinski A, Góra PF (2000) Universal character of stochastic resonance and a constructive role of white noise. J Stat Phys 101: 483-493.
21. Torres JJ, Marro J, Mejias JF (2011) Can intrinsic noise induce various resonant
peaks? New J of Physics 13: 053014.
22. Vilar JMG, Rubí JM (1997) Stochastic multiresonance. Phys Rev Lett 78: 2882-2885.
23. Kim BJ, Minnhagen P, Kim HJ, Choi MY, Jeon GS (2001) Double stochastic
resonance peaks in systems with dynamic phase transitions. EPL 56: 333.
24. Hong H (2005) Enhancement of coherent response by quenched disorder. Phys
Rev E 71: 021102.
25. Barbi M, Reale L (2005) Stochastic resonance in the LIF models with input or threshold noise. Biosystems 79: 61-66.
26. Tessone CJ, Mirasso CR, Toral R, Gunton JD (2006) Diversity-induced
resonance. Phys Rev Lett 97: 194101.
27. Zhang J, Liu J, Chen H (2008) Selective effects of noise by stochastic multi-
resonance in coupled cells system. Sci China Ser G 51: 492–498.
28. Pantic L, Torres JJ, Kappen HJ, Gielen SCAM (2002) Associative memory with
dynamic synapses. Neural Comput 14: 2903–2923.
29. Torres JJ, Cortes JM, Marro J, Kappen HJ (2008) Competition between synaptic
depression and facilitation in attractor neural networks. Neural Comput 19:
2739–2755.
30. Mejias JF, Hernandez-Gomez B, Torres JJ (2012) Short-term synaptic
facilitation improves information retrieval in noisy neural networks. EPL 97:
48008.
31. de Franciscis S, Torres JJ, Marro J (2010) Unstable dynamics, nonequilibrium
phases, and criticality in networked excitable media. Phys Rev E 82: 041105.
32. Marro J, Torres JJ, Cortes JM (2008) Complex behavior in a network with time-
dependent connections and silent nodes. J Stat Mech 2008: P02017.
33. Torres JJ, Marro J, Cortes JM, Wemmenhove B (2008) Instabilities in attractor
networks with fast synaptic fluctuations and partial updating of the neurons
activity. Neural Networks 21: 1272–1277.
34. Sejnowski TJ (1977) Storing covariance with nonlinearly interacting neurons. J Math Biol 4: 303-321.
35. Tsodyks MV, Markram H (1997) The neural code between neocortical
pyramidal neurons depends on neurotransmitter release probability. Proc Natl
Acad Sci USA 94: 719–723.
36. Marro J, Dickman R (1999) Nonequilibrium Phase Transitions in Lattice
Models. Cambridge University Press.
37. Hopfield JJ (1982) Neural networks and physical systems with emergent
collective computational abilities. Proc Natl Acad Sci USA 79: 2554–2558.
38. Jimbo Y, Tateno T, Robinson HP (1999) Simultaneous induction of pathway-
specific potentiation and depression in networks of cortical neurons. Biophys J
76: 670–678.
39. Cortes JM, Torres JJ, Marro J, Garrido PL, Kappen HJ (2006) Effects of fast
presynaptic noise in attractor neural networks. Neural Comput 18: 614–633.
40. Hempel CM, Hartman KH, Wang XJ, Turrigiano GG, Nelson SB (2000) Multiple forms of short-term plasticity at excitatory synapses in rat medial prefrontal cortex. J Neurophysiol 83: 3031-3041.
41. Mongillo G, Barak O, Tsodyks M (2008) Synaptic theory of working memory.
Science 319: 1543–1546.
42. Amit DJ (1989) Modeling Brain Function: The World of Attractor Neural Networks. Cambridge University Press.
43. Brey JJ, Prados A (1996) Stochastic resonance in a one-dimensional Ising model. Physics Letters A 216: 240-246.
44. Krawiecki A (2008) Stochastic multiresonance in the Ising model on scale-free networks. Acta Phys Polonica B 39: 1103-1114.
45. Krawiecki A (2009) Structural stochastic multiresonance in the Ising model on scale-free networks. Eur Phys J B 69: 81-86.
46. Torres JJ, Muñoz MA, Marro J, Garrido PL (2004) Influence of topology on the performance of a neural network. Neurocomputing 58-60: 229-234.
47. Johnson S, Marro J, Torres JJ (2008) Functional optimization in complex
excitable networks. EPL 83: 46006.
48. de Franciscis S, Johnson S, Torres JJ (2011) Enhancing neural-network
performance via assortativity. Phys Rev E 83: 036114.
49. Eguíluz VM, Chialvo DR, Cecchi GA, Baliki M, Apkarian AV (2005) Scale-free brain functional networks. Phys Rev Lett 94: 018102.
50. Honey CJ, Kötter R, Breakspear M, Sporns O (2007) Network structure of cerebral cortex shapes functional connectivity on multiple time scales. Proceedings of the National Academy of Sciences 104: 10240-10245.
51. Petermann T, Thiagarajan TC, Lebedev MA, Nicolelis MAL, Chialvo DR, et
al. (2009) Spontaneous cortical activity in awake monkeys composed of neuronal
avalanches. Proceedings of the National Academy of Sciences 106: 15921–
15926.
52. Friedman N, Ito S, Brinkman BAW, Shimono M, DeVille REL, et al. (2012)
Universal critical dynamics in high resolution neuronal avalanche data. Phys
Rev Lett 108: 208102.
53. Radicchi F, Baronchelli A, Amaral LAN (2012) Rationality, irrationality and escalating behavior in lowest unique bid auctions. PLoS ONE 7: e29910.
54. Yasuda H, Miyaoka T, Horiguchi J, Yasuda A, Hänggi P, et al. (2008) Novel class of neural stochastic resonance and error-free information transfer. Phys Rev Lett 100: 118103.
55. Lugo E, Doti R, Faubert J (2008) Ubiquitous crossmodal stochastic resonance in
humans: Auditory noise facilitates tactile, visual and proprioceptive sensations.
PLoS ONE 3: e2860.
56. Colgin LL, Denninger T, Fyhn M, Hafting T, Bonnevie T, et al. (2009) Frequency of gamma oscillations routes flow of information in the hippocampus. Nature 462: 353-357.
57. FitzHugh R (1961) Impulses and physiological states in theoretical models of nerve membrane. Biophys J 1: 445-466.
58. Nagumo J, Arimoto S, Yoshizawa S (1962) An active pulse transmission line
simulating nerve axon. Proceedings of the IRE 50: 2061–2070.
59. Izús GG, Deza RR, Wio HS (1998) Exact nonequilibrium potential for the FitzHugh-Nagumo model in the excitable and bistable regimes. Phys Rev E 58: 93-98.
60. Izhikevich EM (2007) Dynamical Systems in Neuroscience: The Geometry of
Excitability and Bursting. The MIT Press.
... Then, Mejias and Torres found that the combination of depression and facilitation synapses can enhance the storage capacity (Mejias and Torres, 2009) and furthermore, Mejias et al. generated phase diagrams, indicating that synaptic facilitation enlarges the memory phase region (Mejias et al., 2012). Further, dynamic synapses play a role in stochastic resonance, where a weak input signal to a network can be detected in an output signal under certain conditions (Pantic et al., 2003;Torres, 2008, 2011;Torres et al., 2011;Pinamonti et al., 2012;Torres and Marro, 2015). Pantic et al. have shown that a neuron with depression synapses is capable of detecting noisy input signals with a wider frequency range, compared to one with static synapses, under a certain firing threshold (Pantic et al., 2003). ...
... Torres et al. demonstrated that a model with this interplay can predict experimental data of stochastic resonance . Furthermore, Pinamonti et al. demonstrated that stochastic resonance is well enhanced near phase transitions among patterns in an associative memory network (Pinamonti et al., 2012). Then, Torres and Marro generated a detailed phase diagram embedding many patterns associated with stochastic resonance, such that multiple noise levels are well responsible for optimizing input signals (Torres and Marro, 2015). ...
Article
Full-text available
We investigate a discrete-time network model composed of excitatory and inhibitory neurons and dynamic synapses with the aim at revealing dynamical properties behind oscillatory phenomena possibly related to brain functions. We use a stochastic neural network model to derive the corresponding macroscopic mean field dynamics, and subsequently analyze the dynamical properties of the network. In addition to slow and fast oscillations arising from excitatory and inhibitory networks, respectively, we show that the interaction between these two networks generates phase-amplitude cross-frequency coupling (CFC), in which multiple different frequency components coexist and the amplitude of the fast oscillation is modulated by the phase of the slow oscillation. Furthermore, we clarify the detailed properties of the oscillatory phenomena by applying the bifurcation analysis to the mean field model, and accordingly show that the intermittent and the continuous CFCs can be characterized by an aperiodic orbit on a closed curve and one on a torus, respectively. These two CFC modes switch depending on the coupling strength from the excitatory to inhibitory networks, via the saddle-node cycle bifurcation of a one-dimensional torus in map (MT1SNC), and may be associated with the function of multi-item representation. We believe that the present model might have potential for studying possible functional roles of phase-amplitude CFC in the cerebral cortex.
... The enhanced sensibility of neural networks to external stimuli due to dynamic synapses provides a controlled mechanism to efficiently process weak signals in a background of noisy activity. In fact, short-term synaptic plasticity together with nonlinear mechanisms affecting neuron excitability can induce efficient signal detection at different noise levels [53][54][55]. ...
... These studies suggest that relevant phenomena appear as a consequence of the instabilities induced by the interplay between the underlying noise and synaptic depression, which could also be discussed in a context where intrinsic subthreshold oscillations are present. In spite of its subthreshold nature, the information encoded in these oscillations could be efficiently processed by dynamic synapses at different noise levels through stochastic resonance mechanisms [53][54][55]. ...
Article
In this paper we analyze the interplay between the subthreshold oscillations of a single neuron conductance-based model and the short-term plasticity of a dynamic synapse with a depressing mechanism. In previous research, the computational properties of subthreshold oscillations and dynamic synapses have been studied separately. Our results show that dynamic synapses can influence different aspects of the dynamics of neuronal subthreshold oscillations. Factors such as maximum hyperpolarization level, oscillation amplitude and frequency or the resulting firing threshold are modulated by synaptic depression, which can even make subthreshold oscillations disappear. This influence reshapes the postsynaptic neuron's resonant properties arising from subthreshold oscillations and leads to specific input/output relations. We also study the neuron's response to another simultaneous input in the context of this modulation, and show a distinct contextual processing as a function of the depression, in particular for detection of signals through weak synapses. Intrinsic oscillations dynamics can be combined with the characteristic time scale of the modulatory input received by a dynamic synapse to build cost-effective cell/channel-specific information discrimination mechanisms, beyond simple resonances. In this regard, we discuss the functional implications of synaptic depression modulation on intrinsic subthreshold dynamics.
... 50 Before we discuss our main results in the next section, it is convenient to describe qualitatively here the variety of phases at which the system stabilizes (generally after a transient time) depending on the noise intensity. This phenomenology was essentially reported before, 19,51 but there are important peculiarities and some new facts. As outlined in fig.2, one may distinguish seven qualitatively different cases as follows: ...
... This fact in a sense characterizes C(D) as the order parameter or perhaps the susceptibility for these nonequilibrium phase transitions. It follows the important result that monitoring appropriate quantities, namely, input/output correlations, e.g., in psychotechnic experiments concerning the transmission of weak signals competing with noise, 19,51,52 one may locate high level brain functions, including associated critical conditions. Note that such a task will in practice be eased by the fact that the peaks substantially increase with d, while the phase diagram remains practically unchanged (cf. the cases in figures 2 and 6). ...
Article
Full-text available
We here illustrate how a well-founded study of the brain may originate in assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms, and it results a description of how the excitability associated to (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous with rapid time-varying intensities mimicking fatigue and potentiation. Emergence then becomes quite robust against wiring topology modification-in fact, we considered from a fully connected network to the Homo sapiens connectome-showing the essential role of synaptic flickering on computations. We also suggest how to experimentally disclose significant changes during actual brain operation.
... Stochastic multiresonance (SMR) is a type of SR in which the periodicity of the response of a nonlinear system to a weak oscillating signal is maximized for several intensities of the external or internal noise [34,35]. SMR was observed, e.g., in systems of many interacting units such as neurons [36][37][38][39], threshold elements [40], spins in the Ising model [28] and -for narrow intervals of the amplitude of the oscillating signal -in the MV model on regular and small-world lattices [17]. In particular, the so-called structural SMR was reported in the Ising model on certain SF networks [28]. ...
... Over the last few decades, much attention has been devoted to stochastic resonance (SR), which manifests itself as detection and transmission of weaksignals in the presence of noise [1,2] . In a large variety of artificial and natural nonlinear dynamical systems, including physical and biological ones, it is observed that the presence of an optimal level of noise leads to maximal correlation between the input signal and the system's response [3][4][5][6][7][8][9][10][11][12] . However, due to the random nature of noise, researchers have been looking for alternatives to obtain similar signal detection performances. ...
Article
We investigate the phenomenon of vibrational resonance (VR) in neural populations, whereby weak low-frequency signals below the excitability threshold can be detected with the help of additional high-frequency driving. The considered dynamical elements consist of excitable FitzHugh–Nagumo neurons connected by electrical gap junctions and chemical synapses. The VR performance of these populations is studied in unweighted and weighted scale-free networks. We find that although the characteristic network features – coupling strength and average degree – do not dramatically affect the signal detection quality in unweighted electrically coupled neural populations, they have a strong influence on the required energy level of the high-frequency driving force. On the other hand, we observe that unweighted chemically coupled populations exhibit the opposite behavior, and the VR performance is significantly affected by these network features whereas the required energy remains on a comparable level. Furthermore, we show that the observed VR performance for unweighted networks can be either enhanced or worsened by degree-dependent coupling weights depending on the amount of heterogeneity.
... [48][49][50][51][52][53][54][55] However, stochastic multiresonance in excitable neuronal networks is still not deeply studied. 56 In this paper, we devote for studying stochastic multiresonance in a smallworld neuronal network, which is locally modelled by FitzHugh-Nagumo (FHN) neuronal model. And besides additive noise, each single neuron is also stimulated by a weak sinusoid weak signal. ...
Article
In this paper, effects of noise on Watts-Strogatz small-world neuronal networks, which are stimulated by a subthreshold signal, have been investigated. With the numerical simulations, it is surprisingly found that there exist several optimal noise intensities at which the subthreshold signal can be detected efficiently. This indicates the occurrence of stochastic multiresonance in the studied neuronal networks. Moreover, it is revealed that the occurrence of stochastic multiresonance has close relationship with the period of subthreshold signal Te and the noise-induced mean period of the neuronal networks T0. In detail, we find that noise could induce the neuronal networks to generate stochastic resonance for M times if Te is not very large and falls into the interval (M×T0,(M+1)×T0) with M being a positive integer. In real neuronal system, subthreshold signal detection is very meaningful. Thus, the obtained results in this paper could give some important implications on detecting subthreshold signal and propagating neuronal information in neuronal systems.
Chapter
DESCRIPTION The Byzantine Empire—center through a millennium for commerce, culture, and data in the world—was a fertile setup in which bestiaries spread [Kalof and Resl, A Cultural History of Animals in the Medieval Age (1000–1400) (Berg Publishers, Oxford, 2007 Kalof, L., and Resl, B., A Cultural History of Animals in the Medieval Age (Berg Publishers, Oxford, 2007).)]. These are literary images of animals and plants that, together with a variety of fantastic beings, appeared in compendia and beautifully illustrated books, paintings, and bas-reliefs. Often having a moralizing purpose, they meant admitting a symbolic language of nature, and each element had its specific characteristics and function. With a more scientific intent, thus serving the purposes of this book, we now perfect our bestiary, a compendium of phenomena concerning the thing. We thus come to examine further “creatures” of those that have attributes of complexity and criticality together with an intentional irregular dynamics—beasts that, incidentally, may also merit addition to a medieval album. Actually, a contemporary bestiary should include, for example, cellular metabolism; the action of genetic networks; various mechanisms underlying memory, intelligence, and consciousness; the rise, spread, and dissipation of epidemics; the dynamics of the immune system and social revolutions, bird flocks, and fish banks; and the occurrence of large fluctuations such as huge crashes in financial markets and massive failures of power grids. We have already addressed in this book guidelines to explain such a wide and intricate bestiary, aspiring to help to draw “moralizing” conclusions. New phenomena and their interpretations are next reviewed within this context to clarify, as this essay has proposed to do, the essence of the thing.
Article
We investigate the stochastic resonance phenomenon in a discrete Hopfield neural network for transmitting binary amplitude modulated signals, wherein the binary information is represented by two stored patterns. Based on the potential energy function and the input binary signal amplitude, the observed stochastic resonance phenomena involve two general noise-improvement mechanisms. A suitable amount of added noise assists or accelerates the switch of the network state vectors to follow input binary signals more correctly, yielding a lower probability of error. Moreover, at a given added noise level, the probability of error can be further reduced by the increase of the number of neurons. When the binary signals are corrupted by external heavy-tailed noise, it is found that the Hopfield neural network with a large number of neurons can outperform the matched filter in the region of low input signal-to-noise ratios per bit.
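A minimal sketch of the noise-assisted switching mechanism described in this abstract could look as follows (Python, with illustrative network size, signal amplitude, and noise level of our own choosing; it is not the authors' implementation): two opposite stored patterns encode the two bit values, a weak input biased toward one of them drives asynchronous Hopfield dynamics, and the transmitted bit is decoded from the sign of the final overlap.

import numpy as np

rng = np.random.default_rng(0)
N = 200
xi = rng.choice([-1, 1], size=N)         # pattern encoding bit 1; -xi encodes bit 0
J = np.outer(xi, xi) / N                 # Hebbian couplings, no self-coupling
np.fill_diagonal(J, 0.0)

def transmit(bit, amp=0.05, sigma=0.3, sweeps=20):
    # Asynchronous dynamics with a weak input toward +/- xi plus additive noise.
    s = rng.choice([-1, 1], size=N)
    drive = amp * (xi if bit == 1 else -xi)
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s + drive[i] + sigma * rng.standard_normal()
            s[i] = 1 if h >= 0 else -1
    return int(np.dot(s, xi) > 0)        # decode the bit from the overlap sign

bits = rng.integers(0, 2, size=100)
errors = sum(transmit(int(b)) != b for b in bits)
print("error rate:", errors / bits.size)
# Scanning sigma should give a minimum of the error rate at a nonzero noise
# level: the noise-improved switching reported in the abstract.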
Article
Noise is an inherent part of neuronal dynamics, and thus of the brain. It can be observed in neuronal activity at different spatiotemporal scales, including in neuronal membrane potentials, local field potentials, electroencephalography, and magnetoencephalography. A central research topic in contemporary neuroscience is to elucidate the functional role of noise in neuronal information processing. Experimental studies have shown that a suitable level of noise may enhance the detection of weak neuronal signals by means of stochastic resonance. In response, theoretical research, based on the theory of stochastic processes, nonlinear dynamics, and statistical physics, has made great strides in elucidating the mechanism and the many benefits of stochastic resonance in neuronal systems. In this perspective, we review recent research dedicated to neuronal stochastic resonance in biophysical mathematical models. We also explore the regulation of neuronal stochastic resonance, and we outline important open questions and directions for future research. A deeper understanding of neuronal stochastic resonance may afford us new insights into the highly impressive information processing in the brain.
Article
We numerically investigate the transmission of time-modulated random point trains in a conductance-based neuron model by including shot noise described as additive noise trains. The results show that additive noise trains can induce neuron responses that are correlated with the temporally modulated random point trains. In addition, the additive noise power density can be increased up to an optimal value at which the output signal-to-noise ratio (SNR) reaches a maximum; this noise-enhanced transmission of random point trains can be related to the stochastic resonance (SR) phenomenon. More interestingly, we find that the SNR gain can exceed unity and can also be optimized by tuning the average rate of the input random point trains. The present study illustrates the potential of additive noise and temporally modulated random point trains for optimizing the response of a neuron to its inputs, and it provides guidance for the design of information-processing devices that respond to random neuron spiking.
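For readers unfamiliar with how such an output SNR is usually quantified, the following generic sketch (an assumption-laden stand-in, not the cited conductance-based model) estimates it from a periodogram as the peak power at the modulation frequency divided by the mean background power around it.

import numpy as np
from scipy.signal import periodogram

def snr_from_trace(x, fs, f_sig, bw=0.5):
    # Peak power at the signal frequency over the mean background power
    # in a surrounding band (excluding the peak itself).
    f, P = periodogram(x, fs=fs)
    peak = P[np.argmin(np.abs(f - f_sig))]
    band = (np.abs(f - f_sig) > bw) & (np.abs(f - f_sig) < 5 * bw)
    return peak / P[band].mean()

# Example: a weak 5 Hz modulation buried in unit-variance noise.
rng = np.random.default_rng(5)
fs, T = 1000.0, 60.0
t = np.arange(0.0, T, 1.0 / fs)
x = 0.1 * np.sin(2 * np.pi * 5.0 * t) + rng.standard_normal(t.size)
print("output SNR estimate:", snr_from_trace(x, fs, 5.0))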
Article
Full-text available
Noise in dynamical systems is usually considered a nuisance. But in certain nonlinear systems, including electronic circuits and biological sensory apparatus, the presence of noise can in fact enhance the detection of weak signals. This phenomenon, called stochastic resonance, may find useful application in physical, technological and biomedical contexts.
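The textbook realization of this effect is an overdamped particle in a double well subject to a weak periodic force and Gaussian noise; the following minimal sketch (with assumed parameter values) sweeps the noise intensity D and should display a maximum of the periodic response at an intermediate D.

import numpy as np

def response(D, A=0.1, w=0.01, dt=0.01, T=10000.0, seed=1):
    # Euler-Maruyama integration of dx/dt = x - x^3 + A*cos(w*t) + sqrt(2D)*xi(t).
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.arange(n) * dt
    x = np.empty(n)
    xc = 1.0                                  # start in the right-hand well
    for i in range(n):
        xc += dt * (xc - xc**3 + A * np.cos(w * t[i])) \
              + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        x[i] = xc
    # amplitude of the component of x(t) oscillating at the drive frequency
    return 2.0 * np.hypot(np.mean(x * np.cos(w * t)), np.mean(x * np.sin(w * t)))

for D in (0.02, 0.05, 0.1, 0.2, 0.4):
    print(D, response(D))     # the maximum at an intermediate D is the resonance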
Article
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
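The content-addressable-memory property summarized above is easy to reproduce numerically; the sketch below (illustrative sizes and corruption level, not the original implementation) stores a few random patterns with the Hebb rule and retrieves one of them from a corrupted copy via asynchronous updates.

import numpy as np

rng = np.random.default_rng(2)
N, P = 500, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N           # Hebb rule, P patterns well below capacity
np.fill_diagonal(W, 0.0)                  # no self-coupling

cue = patterns[0].copy()
flip = rng.choice(N, size=N // 3, replace=False)
cue[flip] *= -1                           # corrupt one third of the bits

for _ in range(10):                       # asynchronous deterministic updates
    for i in rng.permutation(N):
        cue[i] = 1 if W[i] @ cue >= 0 else -1

print("overlap with the stored pattern:", np.dot(cue, patterns[0]) / N)   # ~1.0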
Article
We review the behavior of theoretical models of excitable systems driven by Gaussian white noise. We focus mainly on those general properties of such systems that are due to noise, and present several applications of our findings in biophysics and lasers. As prototypes of excitable stochastic dynamics we consider the FitzHugh–Nagumo and the leaky integrate-and-fire model, as well as cellular automata and phase models. In these systems, taken as individual units or as networks of globally or locally coupled elements, we study various phenomena due to noise, such as noise-induced oscillations, stochastic resonance, stochastic synchronization, noise-induced phase transitions and noise-induced pulse and spiral dynamics. Our approach is based on stochastic differential equations and their corresponding Fokker–Planck equations, treated by both analytical calculations and/or numerical simulations. We calculate and/or measure the rate and diffusion coefficient of the excitation process, as well as spectral quantities like power spectra and degree of coherence. Combined with a multiparametric bifurcation analysis of the corresponding cumulant equations, these approaches provide a comprehensive picture of the multifaceted dynamical behaviour of noisy excitable systems.
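As a concrete example of the prototype models mentioned above, the following sketch integrates a leaky integrate-and-fire neuron driven by Gaussian white noise with the Euler-Maruyama scheme and reports the firing rate and the coefficient of variation of the interspike intervals; the parameter values are assumptions chosen only to place the neuron in the noise-driven (subthreshold) regime.

import numpy as np

def lif_spike_times(mu=0.8, D=0.1, tau=1.0, v_th=1.0, v_reset=0.0,
                    dt=1e-3, T=200.0, seed=3):
    # Euler-Maruyama integration of tau*dv/dt = -v + mu + sqrt(2*D)*xi(t),
    # with fire-and-reset at the threshold v_th.
    rng = np.random.default_rng(seed)
    v, t, spikes = 0.0, 0.0, []
    while t < T:
        v += dt * (-v + mu) / tau + np.sqrt(2.0 * D * dt) / tau * rng.standard_normal()
        t += dt
        if v >= v_th:
            spikes.append(t)
            v = v_reset
    return np.array(spikes)

isi = np.diff(lif_spike_times())
print("firing rate:", 1.0 / isi.mean(), " CV of the ISI:", isi.std() / isi.mean())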
Article
The stochastic resonance phenomenon has been observed in many kinds of systems and has been debated by scientists for 30 years. Applications incorporating aspects of stochastic resonance have yet to prove revolutionary in fields such as distributed sensor networks, nano-electronics, and biomedical prosthetics. The initial chapters review stochastic resonance basics and outline some of the controversies and debates that have surrounded it. The book goes on to discuss stochastic quantization in a model where the threshold devices are not necessarily identical but are still independently noisy. Finally, it considers various constraints and trade-offs in the performance of stochastic quantizers. Each chapter ends with a review summarizing the main points and with open questions to guide researchers toward new research directions. © M. D. McDonnell, N. G. Stocks, C. E. M. Pearce and D. Abbott 2008 and Cambridge University Press, 2009.
Article
Preface; 1. Introduction; 2. Driven lattice gases: simulations; 3. Driven lattice gases: theory; 4. Lattice gases with reaction; 5. Catalysis models; 6. The contact process; 7. Models of disorder; 8. Conflicting dynamics; 9. Particle reaction models; Bibliography; Index.
Article
Power spectrum densities of the number of tick quotes per minute (market activity) on three currency markets (USD/JPY, EUR/USD, and JPY/EUR) are analyzed for the period from January 2000 to December 2000. We find peaks in the power spectrum densities at periods of a few minutes. We develop a double-threshold agent model and confirm that the corresponding periodicity can be observed in the activity of this model even though market participants perceive common periodic information that is weaker than their decision-making threshold. The model is simulated numerically and investigated theoretically by means of a mean-field approximation. We propose the hypothesis that the periodicities found in the power spectrum densities arise from the nonlinearity and the diversity of market participants.
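The kind of spectral analysis described above can be sketched in a few lines; the snippet below builds a synthetic per-minute activity series (only a stand-in for real tick-count data) and estimates its power spectrum density with Welch's method to locate a peak at a period of a few minutes.

import numpy as np
from scipy.signal import welch

# One synthetic week of per-minute activity with a weak 3-minute periodicity.
rng = np.random.default_rng(6)
minutes = np.arange(7 * 24 * 60)
activity = rng.poisson(20 + 5 * np.cos(2 * np.pi * minutes / 3.0))

f, P = welch(activity - activity.mean(), fs=1.0, nperseg=1024)   # fs in 1/minute
peak_period = 1.0 / f[1:][np.argmax(P[1:])]
print("dominant period (minutes):", peak_period)                 # ~3 for this series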
Article
Recent developments in seismology, ultrasonics, and underwater acoustics have led to a radical change in the way scientists think about ambient noise - the diffuse waves generated by pressure fluctuations in the atmosphere, the scattering of water waves in the ocean, and any number of other sources that pervade our world. Because diffuse waves consist of the superposition of waves propagating in all directions, they appear to be chaotic and random. That appearance notwithstanding, diffuse waves carry information about the medium through which they propagate.
Article
Stochastic resonance is investigated in the Ising model with ferromagnetic coupling on scale-free networks with various scaling exponents γ ≥ 2 of the degree distribution p(k) ∝ k^(-γ), subjected to a weak oscillating magnetic field. In the case 2 ≤ γ ≤ 3, and for slow to moderate frequencies of the input signal, linear response theory and numerical simulations in the mean-field approximation predict the occurrence of stochastic multiresonance, with the spectral power amplification as a function of temperature exhibiting two maxima, in the vicinity of and below the crossover temperature for the ferromagnetic transition. In the case γ ≥ 3 the spectral power amplification is expected to exhibit a single maximum close to the critical temperature. These predictions are qualitatively confirmed by Monte Carlo simulations of the Ising model on scale-free networks obtained from a preferential-attachment growing procedure.
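A bare-bones Monte Carlo sketch of this setup (illustrative only, with assumed values for network size, temperature, field amplitude, and period) could look as follows: Metropolis single-spin-flip dynamics for an Ising ferromagnet on a Barabási-Albert scale-free network in a weak oscillating field, recording the magnetization response at the drive frequency from which the spectral power amplification is built.

import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
N = 1000
G = nx.barabasi_albert_graph(N, m=3, seed=4)
neigh = [list(G.neighbors(i)) for i in range(N)]
s = rng.choice([-1, 1], size=N)

def sweep(T, h):
    # One Monte Carlo sweep of Metropolis single-spin flips in field h.
    for i in rng.integers(0, N, size=N):
        dE = 2.0 * s[i] * (sum(s[j] for j in neigh[i]) + h)
        if dE <= 0.0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]

h0, period, nper = 0.05, 100, 20      # weak oscillating field, period in sweeps
m_t = []
for step in range(nper * period):
    sweep(T=3.0, h=h0 * np.cos(2 * np.pi * step / period))
    m_t.append(s.mean())

m_t = np.asarray(m_t)
phase = 2 * np.pi * np.arange(m_t.size) / period
amp = 2.0 * np.hypot(np.mean(m_t * np.cos(phase)), np.mean(m_t * np.sin(phase)))
print("magnetization response at the drive frequency:", amp)
# Repeating the scan over T and normalizing by h0 gives the amplification
# curves whose single or double maxima are discussed in the abstract.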
Article
Although signaling between neurons is central to the functioning of the brain, we still do not understand how the code used in signaling depends on the properties of synaptic transmission. Theoretical analysis combined with patch clamp recordings from pairs of neocortical pyramidal neurons revealed that the rate of synaptic depression, which depends on the probability of neurotransmitter release, dictates the extent to which firing rate and temporal coherence of action potentials within a presynaptic population are signaled to the postsynaptic neuron. The postsynaptic response primarily reflects rates of firing when depression is slow and temporal coherence when depression is fast. A wide range of rates of synaptic depression between different pairs of pyramidal neurons was found, suggesting that the relative contribution of rate and temporal signals varies along a continuum. We conclude that by setting the rate of synaptic depression, release probability is an important factor in determining the neural code.
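The rate of synaptic depression invoked above is usually captured by the standard phenomenological model of recovering release resources (the Tsodyks-Markram description); the sketch below implements that model with typical, assumed parameter values to show how successive responses to a regular presynaptic train decay toward a depressed steady state, so that a strongly depressing synapse signals changes in rate rather than the rate itself.

import numpy as np

def depressing_synapse(spike_times, U=0.5, tau_rec=0.8):
    # The fraction x of available resources recovers between spikes, and a
    # fraction U of it is released (and hence depleted) by each spike.
    x, last_t, amps = 1.0, 0.0, []
    for t in spike_times:
        x = 1.0 - (1.0 - x) * np.exp(-(t - last_t) / tau_rec)   # recovery
        amps.append(U * x)                                       # response amplitude
        x -= U * x                                               # depletion
        last_t = t
    return np.array(amps)

# Responses to a regular 20 Hz presynaptic train decay toward a depressed
# steady state; a larger U (higher release probability) depresses faster.
train = np.arange(0.0, 1.0, 0.05)        # spike times in seconds
print(depressing_synapse(train).round(3))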