Short-term plasticity optimizes synaptic information transmission.
ABSTRACT Short-term synaptic plasticity (STP) is widely thought to play an important role in information processing. This major function of STP has recently been challenged, however, by several computational studies indicating that transmission of information by dynamic synapses is broadband, i.e., frequency independent. Here we developed an analytical approach to quantify time- and rate-dependent synaptic information transfer during arbitrary spike trains using a realistic model of synaptic dynamics in excitatory hippocampal synapses. We found that STP indeed increases information transfer in a wide range of input rates, which corresponds well to the naturally occurring spike frequencies at these synapses. This increased information transfer is observed both during Poisson-distributed spike trains with a constant rate and during naturalistic spike trains recorded in hippocampal place cells in exploring rodents. Interestingly, we found that the presence of STP in low release probability excitatory synapses leads to optimization of information transfer specifically for short high-frequency bursts, which are indeed commonly observed in many excitatory hippocampal neurons. In contrast, more reliable high release probability synapses that express dominant short-term depression are predicted to have optimal information transmission for single spikes rather than bursts. This prediction is verified in analyses of experimental recordings from high release probability inhibitory synapses in mouse hippocampal slices and fits well with the observation that inhibitory hippocampal interneurons do not commonly fire spike bursts. We conclude that STP indeed contributes significantly to synaptic information transfer and may serve to maximize information transfer for specific firing patterns of the corresponding neurons.
Short-term plasticity (STP) acts on millisecond-to-minute time-
scales to modulate synaptic strength in an activity-dependent
manner. STP is widely believed to play an important role in syn-
aptic computations and to contribute to many essential neural
functions, particularly information processing (Abbott and Re-
gehr, 2004; Deng and Klyachko, 2011). Specific computations
performed by STP are often based on various types of filtering
operations (Fortune and Rose, 2001; Abbott and Regehr, 2004).
This generally accepted role of STP in information processing has
been challenged recently by several computational studies that
analyzed synaptic transmission within an information-theoretic
framework (Lindner et al., 2009; Yang et al.,
2009). Although earlier studies have shown that information trans-
fer is dependent on release probability (Zador, 1998), which is di-
rectly modified by STP, Lindner et al. (2009) used a generalized
model of STP to show that information transfer by dynamic syn-
apses is largely frequency independent in synapses with
dominant facilitation or depression. Similar results obtained using
a different model of synaptic dynamics also indicated
that information transmission is predominately broadband
(Yang et al., 2009). These studies suggested that STP does not
contribute to frequency-dependent information filtering and
raised the question of what specific roles STP plays in synaptic
information processing.
One common feature of these computational studies, how-
ever, is that they considered only steady-state conditions that
synapses reach after prolonged periods of high-frequency stimu-
lation. Although this is a physiologically plausible condition, it
reflects the strained state of the synapses when significant
amounts of their resources, such as the readily releasable pool
(RRP) of vesicles, are depleted. This state is not representative,
for instance, of excitatory CA3–CA1 synapses, which typically
experience rather short 15- to 25-spike-long high-frequency
bursts separated by relatively long periods of lower activity (Fen-
ton and Muller, 1998). Only a small fraction of the RRP is
used at any time during such naturalistic activity (Kandaswamy
et al., 2010) with nearly complete RRP recovery between the
bursts. This discrepancy between the strained state of synapses
used in analytical calculations of information transmission and
the realistic state of the synapses during natural activity suggests
that more representative conditions need to be considered to
evaluate STP contributions to information processing.
Author contributions: Z.R. and V.A.K. designed research; Z.R. and P.-Y.D. performed research; Z.R. and V.A.K.
14800 • The Journal of Neuroscience, October 12, 2011 • 31(41):14800–14809

We therefore developed an analytical approach to calculate
the time and rate dependence of synaptic information transmis-
sion using a realistic model of STP in excitatory hippocampal
synapses. Indeed, using more realistic conditions, we found that
STP contributes significantly to increasing information transfer
over a wide frequency range. Furthermore, our time-dependent
analysis indicated that STP optimizes information transmission
specifically for short high-frequency spike bursts in low release
probability synapses, and that this optimization shifts from
bursts to single spikes in high release probability synapses. We
verified these predictions using recordings in excitatory and in-
hibitory hippocampal synapses with corresponding properties.
Our study thus supports a significant role of STP in information
transmission within the information-theoretic framework and
shows that STP works to optimize information transmission for
specific firing patterns of the corresponding neurons.
Materials and Methods
Animals and slice preparation. Horizontal hippocampal slices (350 μm)
were prepared from 15- to 25-d-old mice using a vibratome (VT1200 S,
Leica). Both male and female animals were used for recordings. Dissec-
tions were performed in ice-cold solution that contained the following
(in mM): 130 NaCl, 24 NaHCO3, 3.5 KCl, 1.25 NaH2PO4, 0.5 CaCl2, 5.0
MgCl2, and 10 glucose, saturated with 95% O2 and 5% CO2, pH 7.3.
Slices were incubated in the above solution at 35°C for 1 h for recovery
and then kept at room temperature (≈23°C) until use. All animal pro-
cedures conformed to the guidelines approved by the Washington Uni-
versity Animal Studies Committee.
Electrophysiological recordings. Whole-cell patch-clamp recordings
were performed using an Axopatch 200B amplifier (Molecular Devices)
from CA1 pyramidal neurons visually identified with infrared video mi-
croscopy (BX50WI, Olympus; Dage-MTI) and differential interference
contrast optics. All recordings were performed at near-physiological
temperatures (33–34°C). The recording electrodes were filled with the
following (in mM): 130 K-gluconate, 0.5 EGTA, 2 MgCl2, 5 NaCl, 2
ATP2Na, 0.4 GTPNa, and 10 HEPES, pH 7.3. The extracellular solution
contained the following (in mM): 130 NaCl, 24 NaHCO3, 3.5 KCl, 1.25
AP5 (50 μM) to prevent long-term effects. EPSCs were evoked by
stimulation of Schaffer collaterals with a bipolar electrode placed in the
stratum radiatum ≈300 μm (range ≈200–500 μm) from the recording
electrode. EPSCs recorded in this configuration represent an averaged
synaptic response across a population of similar CA3–CA1 synapses. The
same recording configuration was previously used to verify that receptor
desensitization and saturation are insignificant and that voltage-clamp
errors are small in the relevant frequency range.
Data were acquired using software written in LabVIEW and analyzed
using programs written in MATLAB. EPSCs during the stimulus trains
were normalized to an average of five low-frequency (0.2 Hz) control
responses preceding each train; successive trains were separated by
≈2 min of low-frequency (0.1–0.2 Hz) control stimuli to allow full
recovery. A template EPSC was created for each stimulus presentation
by averaging all EPSCs within a given train that were separated by at
least 100 ms from their neighbors and normalized to their peak values.
Every EPSC in the train then was approximated by a scaled version of
this template.
The natural stimulus trains used in this study represent the firing
patterns recorded in vivo from the hippocampal place cells of awake,
freely moving rodents (Fenton and Muller, 1998).
Spikes with ISIs ≤10 ms were treated as a single stimulus, because the
delay between the action potential firing and the peak of postsynaptic
currents/potentials prevented resolution of individual synaptic re-
sponses at shorter ISIs. Such treatment does not significantly affect syn-
aptic responses to natural stimulus trains, as we have shown previously
(Klyachko and Stevens, 2006a).
Analytical framework for analysis of information transmission by dy-
namic synapses. Information theory provides a general framework to
quantify information transfer in any system based on the principles of
Shannon (1948). Our approach described below is an extension of the
earlier formalism of Zador (1998) and quantifies the contribution
of STP to information transfer based on these principles. Synaptic infor-
mation transmission can be measured by how much information the
output spike train provides about the input train, which is termed “mu-
tual information” (Shannon, 1948). Within the information-theoretic
framework, the entropy of a variable x is defined as

H(x) = −Σ_i P(x_i) log2 P(x_i),    (1)

where P(x_i) is the probability of variable x to have the value x_i. The
synaptic mutual information I_m depends on the input (also termed
“source”) spike train’s entropy, H(s), the entropy of output spike trains
(or of synaptic responses), H(r), and their joint entropy, H(r, s), and by
definition is given by the following:

I_m = H(r) + H(s) − H(r, s) = H(r) − H(r | s),    (2)

where H(r | s) is the conditional entropy of the output given the inputs,
i.e., the variability of the output for a fixed
input. In practical terms, this means that variability of synaptic output
for the multiple presentations of the same input represents an inherent
“noise” in transmission and does not carry information, because it can-
not distinguish between two different inputs.
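As a concrete illustration of these definitions, the two equivalent forms of Equation 2 can be checked on a toy joint distribution of a binary input s (spike/no spike) and binary output r (release/no release); the probability values below are hypothetical, chosen only for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy (Eq. 1): H = -sum_i P(x_i) log2 P(x_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(r, s), for illustration only.
joint = {(0, 0): 0.80, (0, 1): 0.05, (1, 0): 0.00, (1, 1): 0.15}

# Marginal distributions of the output r and the input s.
p_r = [sum(v for (r, s), v in joint.items() if r == ri) for ri in (0, 1)]
p_s = [sum(v for (r, s), v in joint.items() if s == si) for si in (0, 1)]

H_r, H_s = entropy(p_r), entropy(p_s)
H_rs = entropy(joint.values())

# Eq. 2: both forms of the mutual information agree, because of the
# chain rule H(r | s) = H(r, s) - H(s).
I_m = H_r + H_s - H_rs
H_r_given_s = H_rs - H_s
print(round(I_m, 4), round(H_r - H_r_given_s, 4))
```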
The realistic model of STP we used to describe synaptic dynamics in
the hippocampal synapses (Kandaswamy et al., 2010) is formulated to
predict changes in synaptic release probability during a random spike
input. The term “release probability” (Pr) is commonly used to describe
probability of vesicle release given a presynaptic spike. If we describe the
existing/nonexisting presynaptic spike as s = 1/0, and denote a vesicle
that is released/not released as r = 1/0, then the release probability is
Pr = P(r = 1 | s = 1). In our calculations we will distinguish between
the term Pr, the synaptic release probability, and another probability
variable, p = P(r = 1), which simply represents the probability of a
synaptic response at a given time. The relation between these two vari-
ables is determined by the stimulation rate, p = R · Pr, where R is the
presynaptic firing rate, P(s = 1). The advantage of the chosen STP
formulation is that it allows direct comparison to experimental measure-
ments (Kandaswamy et al., 2010) and provides a useful framework for
the calculations below. Given a spike train as an input to the STP model,
a resulting distribution of release probabilities, f(p, t), at each point in
time can be calculated. The output of individual synapses is determined
by the release of a vesicle, which is a binary event; the synaptic
response (r) is thus a binary-state system at each point in time. Applica-
tion of Equation 1 to calculate mutual information for this simplified
binary-state system gives the following:
H = −p log2(p) − (1 − p) log2(1 − p),    (3)

where p is the probability for the system to be in one state and 1 − p
to be in the other (notice that the expression is symmetrical regarding
assignment of the two states, p → 1 − p). Also note that the above
formulation is derived under the assumption of monovesicular release. Our
previous computational analyses (Kandaswamy et al., 2010) indicated
that models of STP in hippocampal synapses described the experimental
data equally well under the assumption of either monovesicular or
multivesicular release, provided the number of active release sites does not
change significantly during elevated activity levels. Given these previous
results, and because the extent of multivesicular release in hippocampal
synapses and its relationship to activity
and release probability have not been established, we will limit our anal-
ysis to the monovesicular release framework.
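The binary-state entropy of Equation 3 is simple to compute; the sketch below is a generic implementation of the formula, not code from the study:

```python
import math

def binary_entropy(p):
    """Two-state entropy (Eq. 3), in bits: -p log2(p) - (1 - p) log2(1 - p).

    Zero at p = 0 or p = 1, maximal (1 bit) at p = 0.5, and symmetric
    under exchange of the two states, p -> 1 - p.
    """
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

print(binary_entropy(0.5))   # -> 1.0 (maximal uncertainty)
```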
For synaptic transmission, the entropy of vesicle release and thus of
synaptic response, H?r?, is determined by the release probability of the
synaptic ensemble, and is given by the following:
H(r) = −R⟨Pr⟩ log2(R⟨Pr⟩) − (1 − R⟨Pr⟩) log2(1 − R⟨Pr⟩),    (4)

where ⟨...⟩ denotes the ensemble-averaged value, and R is the input
firing rate.
To calculate the second term, H(r | s), in Equation 2, we take into
account that at each time point the release probability
p is drawn from the distribution. Therefore, the expression for H(r | s) is
the average of Equation 3 with the distribution function f(p, t). This
means that for each point in time a different distribution f(p, t) is
used, and then the binary-state entropy (Eq. 3) is calculated for that time
point. The resulting expression for conditional entropy is then given
by the following:
H(r | s) = ∫ f(p, t)[−p log2(p) − (1 − p) log2(1 − p)] dp.    (5)
Note that the averaging in Equation 5 is the ensemble averaging over
available input spike trains. Expression 5 can be further simplified by
noticing that if p = 0 is randomly selected, then the binary-state entropy
(Eq. 3) is exactly 0; we can therefore exclude the case of no stimulation
from Equation 5 by writing the
distribution function f(p, t) as a sum of two contributions given by

f(p, t) = (1 − R) δ_{p,0} + R · f̃(p | s = 1, t),    (6)
where the first component represents a contribution when there is no
stimulation and the second component represents a contribution when
stimulation is present. From Equations 5 and 6 a simplified expression
for the conditional entropy can thus be derived as follows:
H(r | s) = −R ∫ f̃(p, t)[p log2(p) + (1 − p) log2(1 − p)] dp.    (7)
In an individual input train, there will be R stimuli per unit of time, and
each of these will contribute according to the release probability at that
time. In the case of ensemble entropy, each time point contributes
according to the distribution of release probabilities at
that time point. The main difference between full ensemble entropy and
conditional entropy is then determined by the choice of specific firing
patterns.
The mutual information, a measure of transferred information, can
now be simply calculated using Equations 4 and 7 as follows:
I_m(t) = −R⟨Pr⟩ log2(R⟨Pr⟩) − (1 − R⟨Pr⟩) log2(1 − R⟨Pr⟩)
    + R ∫ f̃(p, t)[p log2(p) + (1 − p) log2(1 − p)] dp.    (8)
To verify this derivation, we can check our formalism for a few simple,
time-independent cases. For the perfectly reliable synapse (Pr = 1),
f̃(p, t) = δ(p − 1), which leads to zero conditional entropy. This is
expected, since the synaptic response is then fully determined
by the input, and although different trains produce different responses, a
single stimulation produces only one and the same single response.
The mutual information is then the full entropy of the input given by

I_m = H(r) = H(s) = −R log2(R) − (1 − R) log2(1 − R).    (9)

Similarly, for the perfectly unreliable synapse (Pr = 0), there
is only one response to any input. When ⟨Pr⟩ = 0 is used in Equation 4,
the resulting entropy is 0.
For any other constant release probability Pr = p0, we can calculate
the rate-dependent entropy as H(r) = −(p0·R) log2(p0·R) −
(1 − p0·R) log2(1 − p0·R). The conditional entropy has a sim-
plified expression in this case, because the integration of Equation 7 can
be performed exactly: H(r | s) = −R[p0 log2(p0) +
(1 − p0) log2(1 − p0)].
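These limiting cases are easy to verify numerically. The sketch below treats R as the probability of a presynaptic spike per time bin and uses the closed forms above for a static synapse; the specific values of R and p0 are hypothetical:

```python
import math

def h2(p):
    """Binary entropy (Eq. 3), in bits."""
    return 0.0 if p <= 0 or p >= 1 else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info_static(R, p0):
    """Mutual information for a static synapse with constant Pr = p0.

    R  : probability of a presynaptic spike per time bin
    p0 : constant release probability Pr
    """
    H_r = h2(R * p0)          # Eq. 4 with <Pr> = p0
    H_r_given_s = R * h2(p0)  # Eq. 7 integrates to R * h2(p0)
    return H_r - H_r_given_s

R = 0.05
# Perfectly reliable synapse (Pr = 1): Im equals the full input entropy (Eq. 9).
print(mutual_info_static(R, 1.0) - h2(R))
# Silent synapse (Pr = 0): no information is transmitted.
print(mutual_info_static(R, 0.0))
# Intermediate Pr transmits an intermediate amount, below the input entropy.
print(0.0 < mutual_info_static(R, 0.2) < h2(R))
```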
Because all of the above analysis was performed for a single point in
time (binned time), the resulting Equation 8 thus measures the mutual
information per unit time. We can also define mutual information per
spike by dividing Equation 8 by the stimulation rate R and define the
average cumulative mutual information as the following:
I_cumulative(T) = (1/T) ∫_0^T I_m(t) dt,
which measures the average information transfer per unit time,
within a period of time from 0 to T. It is important to point out that in
the case of a static, constant Pr synapse, mutual information and av-
erage cumulative mutual information are exactly the same because
they are time independent.
Note that the exact values of bits of information transmitted are de-
pendent on the chosen bin width of time and the release probability, so
that if we assume a more precise Pr or spike-timing measurements, their
information contents will increase. We will therefore compare informa-
tion transmission in a model of a dynamic synapse to information
transmission by a synapse with a constant Pr (i.e., no STP); both are
analyzed using exactly the same procedures.
Note that our analysis does not include additional sources of synaptic
response variability, such as variability in the postsynaptic response to a
single vesicle. Because there are no clearly established mechanisms that
control these processes, they can only be modeled as randomly distributed
effects. All such processes therefore would not contribute to information
transfer.
Computational analysis. For simulation of the effects of STP on the
information transfer of the CA3–CA1 excitatory synapse, we used our
previously developed model of STP, which was calibrated
with experimental measurements at this synapse in rat hippocampal
slices (Kandaswamy et al., 2010). This model accounts for three compo-
nents of short-term synaptic enhancement (two components of facilita-
tion and one component of augmentation) and depression, which is
modeled as the depletion of the readily releasable vesicle pool using a
sequential two-pool model. To determine the model parameters in a
wide frequency range, we performed an extensive set of recordings of
synaptic responses in the CA1 neurons in mouse hippocampal slices at
stimulus frequencies of 2–100 Hz. Model parameters were then deter-
mined as previously described by Kandaswamy et al. (2010). The model
was then able to successfully predict
synaptic responses for arbitrary stimulus patterns.
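The full three-component model described above is not reproduced here; as a simplified stand-in, a single-facilitation, single-depletion update of the Tsodyks–Markram type illustrates how release probability evolves across a spike train in a low-Pr synapse. All parameter values are hypothetical:

```python
import math

def stp_release_probs(spike_times_ms, U=0.2, tau_f=100.0, tau_d=500.0):
    """Sketch of activity-dependent release probability (NOT the authors'
    model, which has two facilitation components, augmentation, and a
    sequential two-pool RRP): one facilitation variable u and one
    depletion variable x. Hypothetical parameters:

    U      : basal per-spike release increment (sets basal Pr)
    tau_f  : facilitation decay time constant (ms)
    tau_d  : RRP recovery time constant (ms)
    """
    u, x = 0.0, 1.0
    last_t, out = None, []
    for t in spike_times_ms:
        if last_t is not None:
            dt = t - last_t
            u *= math.exp(-dt / tau_f)                   # facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_d)  # pool recovers
        u += U * (1.0 - u)       # per-spike facilitation jump
        pr = u * x               # effective release probability at this spike
        out.append(pr)
        x *= 1.0 - u             # vesicle use depletes the pool
        last_t = t
    return out

# During a short 50 Hz burst, a low-Pr synapse first facilitates
# (Pr rises above its basal value) before depletion dominates.
probs = stp_release_probs([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
print([round(p, 3) for p in probs])
```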
In the first part of our study, constant-rate Poissonian spike trains
were used. An ensemble of 6400 short trains was simulated for each rate.
Train duration was chosen to achieve the ensemble average number of
spikes in the train equal to 100. This timescale was chosen to match the
timescale of naturalistic firing episodes at these synapses. Because
the quantitative information theory analysis requires discretization of
time, we used time steps of 3 ms. Time steps were chosen to limit the
minimum allowed ISI
to experimentally and physiologically realizable cases.
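The ensemble described above can be generated straightforwardly. The sketch below uses a smaller ensemble than the 6400 trains in the study to keep the example fast; the bin width and target spike count follow the text:

```python
import random

def poisson_train(rate_hz, n_spikes_expected=100, dt_ms=3.0, rng=random):
    """Binned Poisson spike train: in each 3 ms bin a spike occurs with
    probability rate*dt, so the minimum ISI is one bin, as in the text.
    Train duration is chosen so the expected spike count is n_spikes_expected."""
    duration_s = n_spikes_expected / rate_hz
    n_bins = int(round(duration_s * 1000.0 / dt_ms))
    p_spike = rate_hz * dt_ms / 1000.0
    return [1 if rng.random() < p_spike else 0 for _ in range(n_bins)]

rng = random.Random(0)
# Smaller ensemble than the study's 6400 trains, for speed.
ensemble = [poisson_train(20.0, rng=rng) for _ in range(400)]
mean_count = sum(sum(tr) for tr in ensemble) / len(ensemble)
print(round(mean_count, 1))   # close to 100 spikes per train on average
```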
Results
Careful experimental analysis of synaptic information transmis-
sion requires testing a prohibitively large number of possible in-
put patterns; thus, direct experimental measurements of synaptic
information transfer are not currently feasible. Rather, studies of
synaptic information transmission are performed mostly by com-
puter simulations using models of synaptic dynamics (Markram et
al., 1998b; Zador, 1998; Fuhrmann et al., 2002; Silberberg et al.,
2004b; Lindner et al., 2009; Yang et al., 2009). Computationally,
synaptic information transmission can be estimated within the
information-theoretic framework by calculating the mutual infor-
mation (Shannon, 1948) that reflects how much information the
output spike train provides about the input train. To examine the
contribution of STP to information transmission, we developed
an analytical approach to calculate both the rate and time depen-
dence of mutual information in a dynamic synapse in terms of the
entropy of the synaptic response itself, H(r) (Eq. 4), and the condi-
tional entropy of the response given the input, H(r | s) (Eq. 7). This
approach is an extension of the earlier formalism originally devel-
oped by Zador (1998).
To examine synaptic dynamics that closely approximate the
experimental data, we derived the entropy terms as a function of
the input spike rate and synaptic release probability (Eq. 8). The
release probability during input spike trains was determined
excitatory hippocampal synapses (Kandaswamy et al., 2010). To
determine the model parameters in a wide
frequency range, we performed a set of
recordings of synaptic responses in the
CA1 neurons in mouse hippocampal
slices at stimulus frequencies of 2–100 Hz.
The observed synaptic dynamics closely
followed our previous recordings in the
rat slices (Klyachko and Stevens, 2006a,b)
(data not shown). Parameters of the
model were determined as we previously
described (Kandaswamy et al., 2010).
With this optimal set of parameters, the
model has been shown to accurately pre-
dict all key features of synaptic dynamics
during natural spike trains recorded in ex-
ploring rodents (Kandaswamy et al.,
2010). The basal Pr value in the model
was set to 0.2, which represents the
average value measured at these synapses
(Murthy et al., 1997). It is important to
note that although Pr is typically low in
these synapses, it is distributed across a
significant range of values in the popula-
tion of hippocampal synapses. We there-
fore used this average basal Pr value
in our first set of calculations, but then
performed a detailed robustness analysis
of the information transmission and of
our results as a function of all major
model parameters, including the range of
Pr values from 0.05 to 0.4, which includes a large proportion of
the synaptic population (Dobrunz and Stevens, 1997; Murthy
et al., 1997) (see text and Fig. 4).
Previous computational studies considered
the steady-state conditions that synapses reach after prolonged
high-frequency stimulation (Lindner et al., 2009; Yang et al.,
2009). Because these
conditions might not be fully representative of the state in which
synapses operate during natural activity levels, the contribution
of STP could have been underestimated in such anal-
yses if this contribution has a strong temporal component. We
thus focused on deriving and using a time-dependent formalism
to capture such time-dependent effects.
We first applied our time-dependent formalism to examine
information transmission by a dynamic synapse during
constant-rate, Poisson-distributed spike trains. As expected,
the information transmission showed a clear dependence on the
input rate (Fig. 1) similarly to the previous report (Zador, 1998).
Most importantly, we found that information transfer was
greater in the presence of STP than for the constant basal Pr(i.e.,
no STP present) for a wide range of input rates, ≈1–40 Hz, both for
the mutual information per spike (Fig. 1A) and the mutual information
per unit of time (Fig. 1B). At
low input rates, 0.01 ≤ R ≤ 0.1 Hz, at which the STP contribution is
small and does not significantly alter release probability, infor-
mation transmission follows the same line as that for constant
Pr. At higher input rates, information
transfer grows faster in the presence of STP, reaching levels that
nearly double information transmission at the basal Pr value. The
range of input rates at which STP contributes to information
transfer is comparable to the range of fre-
quencies found in natural spike trains
(Fenton and Muller, 1998; Leutgeb et al.,
2005). Thus, STP clearly increases infor-
mation transfer in a frequency-dependent man-
ner, unlike the findings in previous
reports (Lindner et al., 2009; Yang et al.,
2009). This difference stems in part from using
a realistic model of STP that closely ap-
proximates synaptic dynamics in excit-
atory hippocampal synapses and in part
from performing time-dependent analy-
sis. The latter has shown that the steady-
state conditions typically
analyzed in information transmission
studies may not be representative, at least
in the case of the hippocampal excitatory
synapses, during physiologically relevant activity.
If STP contributes to information process-
ing, then the time dependence of the in-
formation transfer may determine the
optimal length and structure of the input
spike trains. Indeed, we observed a clear peak
of information transmission for trains of short lengths
at all rates above ≈16 Hz (Fig. 2A). Lower rates
showed an increase toward the steady-
state value; under
these conditions mutual information
grew with rate independently of the train
length. Higher rates and long train lengths showed convergence
to a common universal value, suggesting that under the condi-
tions approaching the steady state the mutual information is
broadband, as previously reported (Yang et al., 2009). The same
optimization behavior was examined for different input rates (Fig. 2B).
For cumulative mutual information, the optimal train length increased,
as expected when we factored in the added contribution from
the several initial spikes that occurred when information
transfer was low.
To evaluate the benefits of such optimization, we compared
mutual information for the optimal train length at a given firing
rate with that of static synapses with a range of Pr
values. In the case of dynamic synapses with a basal Pr = 0.2,
information transmission for the optimal length spike train was
equivalent to that of a static synapse with twice higher Pr = 0.4
(Fig. 2C). Together, these results show that STP in low release
probability excitatory synapses not only increases information
transmission in a rate-dependent manner, but it also leads to
optimization of information transfer for short spike bursts that
are indeed commonly observed in excitatory hippocampal neu-
rons (Leutgeb et al., 2005).
To examine whether these information transmission principles
hold during natural synaptic activity, we analyzed information transfer dur-
ing natural spike patterns recorded in hippocampal place cells of
freely moving and exploring rodents (Fenton and Muller, 1998).
These spike trains represent the patterns of inputs that the
excitatory hippocampal synapses are likely to encounter in vivo
(Leutgeb et al., 2005). To be able to apply our formalism to an
arbitrary spike train with varying rates, we needed to transform
the input train into a time-dependent rate r(t) and produce a
large ensemble of similar trains, which would make the exact
analysis of information transfer very time intensive. The analysis
of information transfer can be simplified, however, by eliminat-
ing the need for ensemble measurements if the expression for
conditional entropy is expressed through directly mea-
sured values, such as Pr. We thus used an approximation for the
conditional entropy expression in Equation 7 by replacing the
averaging over values of release probability p, by its average, i.e.,
?Pr?, as follows:
H(r | s) = −R[⟨Pr⟩ log2(⟨Pr⟩) + (1 − ⟨Pr⟩) log2(1 − ⟨Pr⟩)].    (10)
We verified the accuracy of this approximation by com-
paring the exact amount of information transfer (given by Eq. 8)
for constant-rate trains using the precise expression for H(r | s) in
Equation 7 versus its approximation in Equation 10. This ap-
proximation resulted in 95% accuracy or better in estimating
information transfer for stimulus trains shorter than ≈40 spikes
and firing rates up to ≈56 Hz (Fig. 3A).

Figure 2. Time dependence of synaptic information transmission. A, Time-dependent mutual information for a dynamic synapse with a basal Pr = 0.2, plotted versus spike number in the train; information transmission is optimal for short trains. B, At higher rates, the optimal train length shifted toward a larger number of spikes, but the same overall optimization behavior is seen. C, If the optimal train length is chosen, the dynamic synapse with a basal Pr = 0.2 can transfer information as efficiently as a static synapse with Pr = 0.4. D, Peak position and width (calculated as a half-width above the steady-state level) of the average cumulative mutual information.

The only input regimes in which larger
deviations were seen were outside the physiologically relevant
range of stimuli for these synapses. The accuracy of this approx-
imation suggests that the entropy held in the distribution f?p, t?
of p values is relatively small with respect to the main contribu-
tion from spike timing. This
is not surprising considering that the probability of an action
potential firing at any given time point is very small. This notion
can be seen more easily using a simple example at a 1 Hz rate.
Consider an ensemble of input spike trains,
all having the 1 Hz rate. The synapse can release with 1 of 10
possible values of release probability (since it is quantized with
0.1 steps), and in reality the values are constrained by the model
so the actual spectrum is even smaller. This explains why in our
approximation the variability arising from the spike timing is
much greater than that of the release probability distribution.
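The accuracy of the approximation in Equation 10 can be checked directly: because the binary entropy is concave, replacing the average of entropies (Eq. 7) with the entropy of the average release probability (Eq. 10) slightly overestimates the conditional entropy, and the two agree closely when the distribution of p is narrow. The distribution parameters below are hypothetical:

```python
import math
import random

def h2(p):
    """Binary entropy (Eq. 3), in bits."""
    return 0.0 if p <= 0 or p >= 1 else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

rng = random.Random(1)
R = 0.06   # spike probability per 3 ms bin (20 Hz), hypothetical

# Hypothetical narrow distribution of release probabilities around 0.2.
ps = [min(max(rng.gauss(0.2, 0.03), 0.01), 0.99) for _ in range(100000)]

H_exact = R * sum(h2(p) for p in ps) / len(ps)   # Eq. 7, Monte Carlo average
H_approx = R * h2(sum(ps) / len(ps))             # Eq. 10, entropy of the mean

# h2 is concave, so by Jensen's inequality the approximation slightly
# overestimates the exact value; for a narrow distribution the relative
# error stays within a few percent.
print(round(H_exact, 5), round(H_approx, 5))
```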
Estimation of information transfer during a natural spike train
using this simplification is based on the assumption that the
single recorded train is representative
of an ensemble of natural spike trains and that variability within
this ensemble is relatively small. Under this assumption, our
analysis shows that information transfer by a dynamic synapse
increases several-fold during spike bursts in the presence of STP
(Fig. 3B,C). The synaptic information transfer due to STP in
these conditions was comparable to that of
a static synapse with Pr = 0.4. This effect of STP is very similar to
the results seen for the optimal length spike train (Fig. 2C), sug-
gesting that natural spike trains in hippocampal neurons may be
optimized to transmit maximal information given the specific
dynamics of their synapses. Note that a
synapse with a basal release probability of Pr = 0.4 is more likely
to exhibit pronounced short-term depression (Dobrunz and Stevens, 1997;
Murthy et al., 1997), leading to an overall decrease of transferred
information (see Fig. 5 and text below). It is thus the tuning
between the natural spike train structure and the dynamic prop-
erties of excitatory hippocampal synapses that allows the en-
hancement of information transfer during natural spike trains.
Although hippocampal excitatory synapses have a low average release probability, Pr varies widely across the synapse population (Murthy et al., 1997). The expression of individual STP components is interdependent with the release probability, and we therefore examined to what extent our findings are robust regarding changes in release probability as well as in individual model parameters. We thus performed the same analysis as described above for a range of variations (up to ~2-fold) in facilitation amplitude, augmentation amplitude, and the time course of RRP recovery that effectively controls depression. Our main findings, including the optimization of information transfer for spike bursts, were not strongly dependent on the model parameters. This analysis also allowed us to examine the roles of different forms of STP in optimization of information transfer. Specifically, we used three metrics to quantify information transfer optimization: the peak position (Fig. 4A), the peak width (Fig. 4B), and the peak height (Fig. 4C). We found that the largest decrease in both the peak position and width occurred when facilitation amplitude was increased, accelerating the use of synaptic resources (vesicles) and leading to faster and stronger depression. The peak position also decreased when the RRP recovery time was decreased (Fig. 4A), increasing vesicle availability; conversely, the peak width increased (Fig. 4B) by effectively extending vesicle availability and thus delaying the onset of depression. We further considered the peak height, which represents one way of quantifying optimization strength. Our analysis indicates that the largest contribution to the peak height comes from the facilitation amplitude. This effect of facilitation is expected, given that in these low release probability synapses synaptic dynamics is indeed dominated by facilitation.
Figure legend (fragment): A, Deviation of the approximation for mutual information from exact numerical calculations. The approximation presented holds true with 95% accuracy for all firing rates between 0.01 and 56 Hz and train lengths examined; accuracy is reduced when the model is stressed to the point where the release probability during prolonged high-rate stimulation approaches zero. B, The average cumulative mutual information I_cumulative for a natural spike train.
Rotman et al. • Information Transfer by Synaptic Dynamics. J Neurosci, October 12, 2011 • 31(41):14800–14809 • 14805
We also examined the robustness of these results to changes in basal release probability in a range from 0.05 to 0.4 (Fig. 4D), which includes the majority of excitatory hippocampal synapses (Murthy et al., 1997). While the optimization of information transmission was observed at all Pr values examined, we found a strong dependence between optimization and Pr, such that the optimal length of the bursts decreased from 35 spikes at Pr = 0.05 to 11 spikes at Pr = 0.4. It is important to note that this analysis represents the lower bound approximation in the sense that in real synapses this dependence between the optimal peak position and Pr is likely to be even stronger. This is because most of the current STP model parameters have been determined from experimental data that represent the averaged behavior of CA3–CA1 synapses, i.e., the synapse with a Pr of ~0.2. Since the functional dependence of STP model parameters on Pr is not currently known, performing this analysis at significantly higher Pr values would require determining a new set of model parameters based on the experimental data recorded at these increased Pr. As the Pr value increases, the experimentally determined amplitudes of facilitation and augmentation would decrease and the amplitude of depression would increase. These indirect effects of increasing the Pr would further accentuate the dependence of optimization on release probability, but they are not taken into account in our current model. Based on these considerations, we limited our analysis to a lower range of Pr values (0.05–0.4), the range within which synaptic dynamics remains facilitation dominated before switching to a depression-dominated mode at higher Pr values (Dobrunz and Stevens, 1997).

Together, these results indicate that STP-mediated optimization of information transmission in unreliable synapses is robust within a relevant range of model parameters and within the lower range of release probabilities that is predominant in excitatory hippocampal synapses.
The above analysis suggests that STP-mediated optimization of information transmission for spike bursts holds for unreliable synapses but not for reliable, high release probability synapses, which are expected to have a dominant short-term depression. Indeed, a depressing model synapse with Pr = 0.5 (and no facilitation/augmentation) shows strong monotonic decay of average cumulative mutual information with the input rate (Fig. 5A). Even at 2 Hz, the depressing synapse with Pr = 0.5 transfers less information during a 150-spike-long train than the static synapse with Pr = 0.4, and at 40 Hz the dynamic synapse transfers substantially less still.
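For contrast, a depression-only sketch (again with illustrative parameters, not the paper's fitted model) shows why a reliable high-Pr synapse transmits less per spike as a train proceeds:

```python
import math

def depressing_release(n_spikes, rate_hz, Pr=0.5, tau_rec=0.5):
    """Effective release per spike for a high-Pr, depression-only synapse:
    a vesicle pool R is depleted by each release and recovers between spikes."""
    R, out = 1.0, []
    dt = 1.0 / rate_hz
    for i in range(n_spikes):
        if i > 0:
            R = 1.0 - (1.0 - R) * math.exp(-dt / tau_rec)  # pool recovery toward 1
        release = Pr * R
        out.append(release)
        R -= release  # depletion
    return out

print(depressing_release(5, 40))  # releases decrease monotonically during the train
```

At 40 Hz the recovery time is far longer than the interspike interval, so each successive spike finds a smaller vesicle pool and carries less signal.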
Based on the above analysis, we predicted that in high release
probability synapses single spikes rather than bursts would be
expected to carry maximal information as the optimal burst
length would approach a value of 1 (Fig. 4D). To verify this
prediction, we took advantage of the fact that a large proportion
of inhibitory hippocampal synapses in the CA1 area have a high
release probability (Mody and Pearce, 2004; Patenaude et al.,
2005) and express dominant short-term depression (Maccaferri
et al., 2000). To examine information transfer in these synapses,
we used a series of measurements we previously performed in
CA1 inhibitory hippocampal synapses with constant-frequency stimulation; the measured values of synaptic strength during trains were used in these calculations. We found that the average cumulative mutual information decayed monotonically with train length, and no optimization peak was observed at any of the frequencies examined (Fig. 5B). This result confirms the prediction of our analysis and indicates that maximal information transfer takes place when the train is composed of single spikes. This fits well with the observation that inhibitory hippocampal interneurons, unlike excitatory pyramidal cells, do not typically fire spike bursts (Connors and Gutnick, 1990).
We have examined the role of synaptic dynamics in information transfer between the input synaptic drive and the output synaptic gain changes in a realistic model of STP in excitatory hippocampal synapses. Our analysis shows that the presence of STP leads to an increase in information transfer in a wide frequency range. Furthermore, considerations of the time dependence of information transmission revealed that STP also determines the optimal number of spikes in a train that maximizes information transmission. Specifically,
in these low release probability synapses, information transmission is optimal for the short high-frequency spike bursts that are indeed common in the firing patterns of excitatory hippocampal neurons, and information transfer increases several-fold when such an optimal spike pattern is used as an input. Our analysis further showed a strong dependence of this optimization on the basal release probability, predicting that in high release probability synapses the optimal burst length will reach unity (so that a single spike is optimal for information transmission), a prediction we verified in recordings from high release probability inhibitory synapses in brain slices. Our findings thus demonstrate that STP contributes significantly to synaptic information processing and works to optimize information transmission for specific firing patterns of the corresponding neurons.
The function of STP in information processing has been suggested by numerous studies of visual and auditory processing (Chance et al., 1998; Taschenberger and von Gersdorff, 2000; Chung et al., 2002; Cook et al., 2003; DeWeese et al., 2005; MacLeod et al., 2007) and of cortical/hippocampal circuit operations (Abbott et al., 1997; Markram et al., 1998a; Silberberg et al., 2004a; Klyachko and Stevens, 2006a; Kandaswamy et al., 2010). Specific computations performed by STP are often based on its temporal filtering properties and include, but are not limited to, detection of transient inputs, such as spike bursts (Lisman, 1997; Richardson et al., 2005; Klyachko and Stevens, 2006a) and abrupt changes in input rate (Abbott et al., 1997), as well as gain control (Abbott et al., 1997), input redundancy reduction (Goldman et al., 2002), and processing of population bursts (Richardson et al., 2005).
Information theory provides a robust quantitative framework to analyze the neural code and has been applied in several studies of synaptic processing (Tsodyks and Markram, 1997; Varela et al., 1997; Markram et al., 1998b; Tsodyks et al., 1998; Zador, 1998; Maass and Zador, 1999; Natschläger et al., 2001; Fuhrmann et al., 2002; Goldman et al., 2002; Loebel and Tsodyks, 2002). The main complication of applying information theory to address physiological questions is its reliance on the analysis of large ensembles of input spike patterns, which requires either prohibitively extensive computations or commitment to simplifying assumptions. In previous studies, calculations based on the ISI distribution were used to significantly reduce the number of simulations needed. This simplification assumes the time independence of synaptic responses, i.e., an approximation of steady-state synaptic conditions. This methodology, however, does not allow the correct analysis of time-dependent information transmission by dynamic synapses with rapidly changing release probability. By developing
an extension of this previous information theory formalism to include time-dependent analysis, we were able to clearly demonstrate the role of STP in increasing information transfer in a wide frequency range. Our approach differs from the steady-state conditions that were used in previous studies of information transfer (Lindner et al., 2009; Yang et al., 2009) and indeed led to different conclusions. Both studies, however, assumed steady-state synaptic behavior, which we found obscured the contributions of STP, which have a strong temporal component. In fact, we have shown, in agreement with Yang et al. (2009), that for significantly long trains, when synapses reach a steady state, information transmission indeed converges to the same unifying level, and there is a wide range of stimulation rates that all exhibit the same information transfer.
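As a toy illustration of the information-theoretic bookkeeping involved, consider a single time bin in which a spike occurs with probability q and, given a spike, vesicle release occurs with probability p. The mutual information between spiking and release then follows directly from the binary entropy function. This is a hypothetical minimal channel for intuition, not the formalism used in the paper:

```python
import math

def hb(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def spike_release_mi(q, p):
    """Mutual information (bits) between a Bernoulli spike (probability q)
    and a release that occurs with probability p only if a spike occurred:
    I(S;R) = H(R) - H(R|S) = hb(q*p) - q*hb(p)."""
    return hb(q * p) - q * hb(p)

# Release probability sets how much each bin can reveal about the input:
for p in (0.1, 0.4, 0.9):
    print(p, round(spike_release_mi(0.5, p), 4))  # MI grows with p
```

In this toy channel the per-bin information grows with release probability; the time dependence studied in the paper enters because p itself changes from spike to spike under STP.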
Analysis of synaptic dynamics during natural spike trains (Fig. 3) indicates that such steady-state conditions are rarely reached at physiologically relevant activity levels, at least in the case of excitatory hippocampal synapses. In addition, performing simulations under steady-state conditions reduces the dynamic range of synaptic strength, which further limits the transferred information.
It is also important to note that our calculations were simplified by avoiding a postsynaptic neuron firing model, a simplification that allowed us to keep our calculations as close to the experimental data as possible. The most commonly used model in similar studies is the integrate-and-fire model, which depends on a number of free parameters, avoids the nonlinear properties of dendritic integration, and is difficult to verify experimentally (Burkitt, 2006; Brette et al., 2007; Paninski et al., 2007). A more realistic approach was suggested by a previous study of the dendrite-to-soma input/output function of CA1 pyramidal neurons, demonstrating that this input–output relationship could be modeled as a linear filter followed by an adapting static-gain function (Cook et al., 2007). Application of such an approach would also require precise knowledge of how multiple heterogeneous synaptic inputs interact and are spatially integrated in the dendrites. Given the intricate spatiotemporal dendritic processing (Spruston, 2008) and complexity of intersynaptic interactions over various timescales (Remondes and Schuman, 2002; Dudman et al., 2007), the problem of linking individual synaptic dynamics to the actual spiking output of a neuron cannot currently be resolved simply by adding a neuronal-spiking model. Moreover, if any neuronal-spiking model transforms its synaptic inputs in a nonlinear, time-dependent manner, it would be advantageous to study these effects independently of the choice of synaptic STP model. It thus remains for future studies to determine how the information transfer of individual synapses is modified by complex dendritic processing. Because the increase in synaptic information transfer is observed over a wide range of input rates, however, we expect that the effects of STP we observed at the level of synaptic output will also be qualitatively present at the level of actual spiking output of a neuron across this entire frequency range.
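The linear-filter-plus-adapting-gain cascade of Cook et al. (2007) mentioned above can be sketched roughly as follows; the kernel shape and the variance-matching gain rule here are illustrative assumptions, not the fitted model from that study:

```python
import numpy as np

def ln_cascade(stimulus, kernel, dt=1e-3, gain_target=1.0):
    """Convolve a stimulus with a temporal kernel, then rescale the output
    so its standard deviation matches gain_target: a crude stand-in for a
    static gain that adapts to the variance of the input."""
    filtered = np.convolve(stimulus, kernel, mode="same") * dt
    sd = filtered.std()
    gain = gain_target / sd if sd > 0 else 1.0
    return gain * filtered

# Smooth biphasic kernel from a difference of exponentials (illustrative
# stand-in for the band-pass filter estimated in that study).
dt = 1e-3
t = np.arange(0.0, 0.5, dt)
kernel = np.exp(-t / 0.1) - np.exp(-t / 0.02)

rng = np.random.default_rng(0)
noise = rng.standard_normal(5000)       # zero-mean white-noise current
out = ln_cascade(noise, kernel, dt)
print(round(out.std(), 3))  # output variability is pinned by the gain stage
```

The point of the gain stage is that the output variability stays fixed while the stimulus variance changes, mirroring the variance-adaptive gain reported for the dendrite-to-soma transfer.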
The key finding of our study is the optimization of information transmission by STP. In low release probability synapses, information transmission is maximal for short high-frequency spike bursts (Fig. 2). This result demonstrates that short-timescale STP components (acting on the order of 30 spikes or fewer) can maximize information transfer for variable rate trains, such as natural spike trains, if they are composed of constant-rate trains of a length that maximizes information transfer at that rate. Our calculations thus predict that short high-frequency bursts, rather than low-frequency trains of any length, are best suited to maximize the information transfer of a synapse. This is indeed in agreement with the experimentally observed firing patterns of excitatory hippocampal neurons.
Based on the same optimization considerations, our analyses predict that high release probability synapses exhibit maximal information transmission when single spikes rather than bursts are used as synaptic input. This effect arises from the switch in synaptic dynamics from facilitation/augmentation to depression at high release probabilities. This interpretation is in agreement with a previous study (Goldman et al., 2002) showing that depressing synapses reduce information redundancy in spike trains. Indeed, when natural spike trains were passed through depressing model synapses in that study, the output showed reduced autocorrelation of spike timing, which is equivalent to our finding of optimization by single spikes. It is tempting to speculate that STP expression might have evolved in part to optimize information transmission for the specific firing patterns of the corresponding neurons, as seems to be the case for both excitatory and inhibitory hippocampal synapses, or, conversely, that the firing patterns of these neurons, such as bursty firing patterns, might have evolved in part to optimize information transfer given the synaptic dynamics. Future analyses of synaptic dynamics in other neural systems will reveal the extent to which this principle applies to other types of synapses, or whether it is specific to a subset of circuits or to certain types of information processing. Such analyses will require a better understanding of information encoding, which currently limits the application of information theory to a wider variety of synapses and circuits.
Abbott LF, Regehr WG (2004) Synaptic computation. Nature 431:796–803.
Abbott LF, Varela JA, Sen K, Nelson SB (1997) Synaptic depression and
cortical gain control. Science 275:220–224.
Brette R, Rudolph M, Carnevale T, Hines M, Beeman D, Bower JM, Diesmann M, Morrison A, Goodman PH, Harris FC Jr, Zirpe M, Natschläger T, Pecevski D, Ermentrout B, Djurfeldt M, Lansner A, Rochel O, Vieville T, Muller E, Davison AP, et al. (2007) Simulation of networks of spiking neurons: a review of tools and strategies. J Comput Neurosci 23:349–398.
Burkitt AN (2006) A review of the integrate-and-fire neuron model: I. Ho-
mogeneous synaptic input. Biol Cybern 95:1–19.
Chance FS, Nelson SB, Abbott LF (1998) Synaptic depression and the tem-
poral response characteristics of V1 cells. J Neurosci 18:4785–4799.
Chung S, Li X, Nelson SB (2002) Short-term depression at thalamocortical
synapses contributes to rapid adaptation of cortical sensory responses in
vivo. Neuron 34:437–446.
Connors BW, Gutnick MJ (1990) Intrinsic firing patterns of diverse neocortical neurons. Trends Neurosci 13:99–104.
Cook DL, Schwindt PC, Grande LA, Spain WJ (2003) Synaptic depression
in the localization of sound. Nature 421:66–70.
Cook EP, Guest JA, Liang Y, Masse NY, Colbert CM (2007) Dendrite-to-
soma input/output function of continuous time-varying signals in hip-
pocampal CA1 pyramidal neurons. J Neurophysiol 98:2943–2955.
Deng PY, Klyachko VA (2011) The diverse functions of short-term plasticity components in synaptic computations. Commun Integr Biol 4:543–548.
DeWeese MR, Hromádka T, Zador AM (2005) Reliability and representational bandwidth in the auditory cortex. Neuron 48:479–488.
Dobrunz LE, Stevens CF (1997) Heterogeneity of release probability, facili-
tation, and depletion at central synapses. Neuron 18:995–1008.
Dudman JT, Tsay D, Siegelbaum SA (2007) A role for synaptic inputs at distal dendrites: instructive signals for hippocampal long-term plasticity. Neuron 56:866–879.
Fenton AA, Muller RU (1998) Place cell discharge is extremely variable during individual field traversals. Proc Natl Acad Sci U S A 95:3182–3187.
Fortune ES, Rose GJ (2001) Short-term synaptic plasticity as a temporal
filter. Trends Neurosci 24:381–385.
Fuhrmann G, Segev I, Markram H, Tsodyks M (2002) Coding of temporal information by activity-dependent synapses. J Neurophysiol 87:140–148.
Goldman MS, Maldonado P, Abbott LF (2002) Redundancy reduction and sustained firing with stochastic depressing synapses. J Neurosci 22:584–591.
Kandaswamy U, Deng PY, Stevens CF, Klyachko VA (2010) The role of
presynaptic dynamics in processing of natural spike trains in hippocam-
pal synapses. J Neurosci 30:15904–15914.
Klyachko VA, Stevens CF (2006a) Excitatory and feed-forward inhibitory hippocampal synapses work synergistically as an adaptive filter of natural spike trains. PLoS Biol 4:e207.
Klyachko VA, Stevens CF (2006b) Temperature-dependent shift of balance among the components of short-term plasticity in hippocampal synapses. J Neurosci 26:6945–6957.
Leutgeb S, Leutgeb JK, Moser MB, Moser EI (2005) Place cells, spatial maps
and the population code for memory. Curr Opin Neurobiol 15:738–746.
Lindner B, Gangloff D, Longtin A, Lewis JE (2009) Broadband coding with
dynamic synapses. J Neurosci 29:2076–2088.
Lisman JE (1997) Bursts as a unit of neural information: making unreliable
synapses reliable. Trends Neurosci 20:38–43.
Loebel A, Tsodyks M (2002) Computation by ensemble synchronization in recurrent networks with synaptic depression. J Comput Neurosci 13:111–124.
Maass W, Zador AM (1999) Dynamic stochastic synapses as computational
units. Neural Comput 11:903–917.
Maccaferri G, Roberts JD, Szucs P, Cottingham CA, Somogyi P (2000) Cell
surface domain specific postsynaptic currents evoked by identified
GABAergic neurones in rat hippocampus in vitro. J Physiol 524:91–116.
MacLeod KM, Horiuchi TK, Carr CE (2007) A role for short-term synaptic
facilitation and depression in the processing of intensity information in
the auditory brain stem. J Neurophysiol 97:2863–2874.
Markram H, Pikus D, Gupta A, Tsodyks M (1998a) Potential for multiple
mechanisms, phenomena and algorithms for synaptic plasticity at single
synapses. Neuropharmacology 37:489–500.
Markram H, Gupta A, Uziel A, Wang Y, Tsodyks M (1998b) Information
processing with frequency-dependent synaptic connections. Neurobiol
Learn Mem 70:101–112.
Mody I, Pearce RA (2004) Diversity of inhibitory neurotransmission
through GABA(A) receptors. Trends Neurosci 27:569–575.
Murthy VN, Sejnowski TJ, Stevens CF (1997) Heterogeneous release properties of visualized individual hippocampal synapses. Neuron 18:599–612.
Natschläger T, Maass W, Zador A (2001) Efficient temporal processing with
biologically realistic dynamic synapses. Network 12:75–87.
Paninski L, Pillow J, Lewi J (2007) Statistical models for neural encoding,
decoding, and optimal stimulus design. Prog Brain Res 165:493–507.
Patenaude C, Massicotte G, Lacaille JC (2005) Cell-type specific GABA syn-
aptic transmission and activity-dependent plasticity in rat hippocampal
stratum radiatum interneurons. Eur J Neurosci 22:179–188.
Puccini GD, Sanchez-Vives MV, Compte A (2007) Integrated mechanisms of anticipation and rate-of-change computations in cortical circuits. PLoS Comput Biol 3:e82.
Remondes M, Schuman EM (2002) Direct cortical input modulates plastic-
ity and spiking in CA1 pyramidal neurons. Nature 416:736–740.
Richardson MJ, Melamed O, Silberberg G, Gerstner W, Markram H (2005) Short-term synaptic plasticity orchestrates the response of pyramidal cells and interneurons to population bursts. J Comput Neurosci 18:323–331.
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423.
Silberberg G, Wu C, Markram H (2004a) Synaptic dynamics control the
timing of neuronal excitation in the activated neocortical microcircuit.
J Physiol 556:19–27.
Silberberg G, Bethge M, Markram H, Pawelzik K, Tsodyks M (2004b) Dy-
namics of population rate codes in ensembles of neocortical neurons.
J Neurophysiol 91:704–709.
Spruston N (2008) Pyramidal neurons: dendritic structure and synaptic in-
tegration. Nat Rev Neurosci 9:206–221.
Taschenberger H, von Gersdorff H (2000) Fine-tuning an auditory synapse
for speed and fidelity: developmental changes in presynaptic waveform,
EPSC kinetics, and synaptic plasticity. J Neurosci 20:9162–9173.
Tsodyks MV, Markram H (1997) The neural code between neocortical py-
ramidal neurons depends on neurotransmitter release probability. Proc
Natl Acad Sci U S A 94:719–723.
Tsodyks M, Pawelzik K, Markram H (1998) Neural networks with dynamic
synapses. Neural Comput 10:821–835.
Varela JA, Sen K, Gibson J, Fost J, Abbott LF, Nelson SB (1997) A quantitative description of short-term plasticity at excitatory synapses in layer 2/3 of rat primary visual cortex. J Neurosci 17:7926–7940.
Wesseling JF, Lo DC (2002) Limit on the role of activity in controlling the
release-ready supply of synaptic vesicles. J Neurosci 22:9708–9720.
Yang Z, Hennig MH, Postlethwaite M, Forsythe ID, Graham BP (2009)
Wide-band information transmission at the calyx of Held. Neural Comput 21:2682–2708.
Zador A (1998) Impact of synaptic unreliability on the information trans-
mitted by spiking neurons. J Neurophysiol 79:1219–1229.
Zucker RS, Regehr WG (2002) Short-term synaptic plasticity. Annu Rev Physiol 64:355–405.