Christian Tetzlaff’s research while affiliated with University of Göttingen and other places


Publications (40)


Figure 1: Method summary. (A) Several topologically identical recurrent networks are constructed, differing only in the variance of their neuronal time constants. The homogeneous network contains identical neurons (the small dispersion in the corresponding distribution on the right is for visualization purposes only). (B) The same multi-dimensional chaotic timeseries, resembling multimodal, partially predictable sensory stimuli, drives all networks (left). The resulting state, X(t) (middle), is then used to linearly approximate a family of tasks F_θ that mimic working-memory-related computations (memory recall, forecasting, and nonlinear processing). A few such tasks are plotted on the right. (C) The approximation accuracy of each network is quantified by the coefficient of determination on the test interval, which measures the (normalized) mismatch between the ground truth and the predicted value.
Figure 2: Intrinsically heterogeneous networks process chaotic timeseries more accurately than homogeneous ones. (A) The schematic of our task space. The highlighted orange and magenta bands on the frontal slice correspond to panels (B) and (C), respectively. (B) Score profile of different networks for linear (d = 1) processing of the first input component (k = 1) for various temporal shifts (∆). More heterogeneous networks outperform the homogeneous networks for nearly all shifts. All scores are exponentially transformed into the (0,1) range before visualization. The dashed line at ≈ 0.4 delineates the chance-level score. (C) Same as (B) but for nonlinear processing (d = 2) of the same component. (D) Score comparison between the homogeneous (x-axis) and multiple heterogeneous (y-axis) networks, for all tasks. Each distinct task parametrization θ = (k, ∆, d) corresponds to one dot. Colors indicate the level of the networks' heterogeneity. The dotted diagonal line indicates equal performance. For the majority of tasks, heterogeneous networks outperform the homogeneous ones.
Figure 3: Heterogeneity enriches the dynamical state. In both rate- (left) and spike-based (right) dynamics, higher heterogeneity enlarges the prominence of (linearly) independent modes, resulting in a repertoire with higher dimensionality. Prominence is defined as the ratio of each singular value to the sum of all singular values (a minimal computation is sketched after these figure captions).
Figure 4: More heterogeneous networks outperform regardless of the average network time constant. Letter-value plots represent the distribution of all 465 tasks in family F, broken down into three complexity tiers delineated by background colors matching the categories defined in Figure 2A: easy (top), medium (middle), and difficult (bottom). In all panels, heterogeneity either elevates the performance of the bulk of tasks or leaves it nearly intact.
Figure 7: Weight dispersion relaxes Dale's law. Connectivity matrix (top) and weight distribution (bottom) for networks with weight dispersion of (A) σ = 1 and (B) σ = 5.
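
As referenced in the Figure 3 caption, the prominence measure is simple enough to write down directly. The following minimal sketch is our own illustration, not the authors' code; it assumes a recorded state matrix X of shape (time, neurons):

import numpy as np

def mode_prominence(X):
    # Center the recorded network states over time, then take the
    # singular values of the resulting matrix.
    s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
    # Prominence of each (linearly) independent mode: its singular
    # value divided by the sum of all singular values.
    return s / s.sum()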


Robust Computation with Intrinsic Heterogeneity
  • Preprint
  • File available

December 2024 · 27 Reads · Christian Tetzlaff

Intrinsic within-type neuronal heterogeneity is a ubiquitous feature of biological systems, with well-documented computational advantages. Recent works in machine learning have incorporated such diversity by optimizing neuronal parameters alongside synaptic connections, and have demonstrated state-of-the-art performance across common benchmarks. However, this performance gain comes at significantly higher computational cost, imposed by the larger parameter space. Furthermore, it is unclear how the neuronal parameters, constrained by the biophysics of their surroundings, are globally orchestrated to minimize top-down errors. To address these challenges, we postulate that neurons are intrinsically diverse and investigate the computational capabilities of such heterogeneous neuronal parameters. Our results show that intrinsic heterogeneity, viewed as a fixed quenched disorder, often substantially improves performance across hundreds of temporal tasks. Notably, smaller but heterogeneous networks outperform larger homogeneous networks, despite consuming less data. We elucidate the underlying mechanisms driving this performance boost and illustrate its applicability to both rate and spiking dynamics. Moreover, our findings demonstrate that heterogeneous networks are highly resilient to severe alterations of their recurrent synaptic hyperparameters, and even removal of the recurrent connections does not compromise performance. The remarkable effectiveness of small heterogeneous networks with relaxed connectivity is particularly relevant for the neuromorphic community, which faces challenges due to device-to-device variability. Furthermore, understanding the mechanism of robust computation with heterogeneity also benefits neuroscientists and machine-learning researchers.
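
To make the setup concrete, here is a minimal sketch of the paradigm described above, under our own illustrative assumptions (network size, lognormal time-constant disorder, a sinusoidal stand-in for the chaotic drive, and a ridge-regression readout; none of these specifics are taken from the paper):

import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 200, 5000, 1e-3                        # neurons, steps, step (s)

# Quenched disorder: each neuron draws its own time constant once.
tau = rng.lognormal(np.log(20e-3), 0.5, size=N)   # heterogeneous taus (s)
W = rng.normal(0, 1 / np.sqrt(N), (N, N))         # recurrent weights
w_in = rng.normal(0, 1, N)                        # input weights

u = np.sin(2 * np.pi * 3 * dt * np.arange(T))     # stand-in input signal
x, X = np.zeros(N), np.empty((T, N))
for t in range(T):
    # Leaky rate dynamics with per-neuron time constants.
    x += dt / tau * (-x + np.tanh(W @ x + w_in * u[t]))
    X[t] = x

# Linear readout for one task from the family F_θ: recall u(t - Delta).
Delta, lam = 50, 1e-4
y = np.roll(u, Delta)                             # delayed-recall target
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
y_hat = X @ w_out

# Coefficient of determination (R^2) as the accuracy score.
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")

Repeating this with different variances of the tau distribution is the comparison the figures above visualize.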


Figure 6: Poisson Disc Sampling of polydisperse spheres. (A) Example distribution for three differently sized particle types confined to the volume of a mesh compartment. (B) Poisson Disc sampling for surface molecules. (C) Poisson Disc sampling for surface molecules restricted to a surface region defined by a triangle face group.
Figure 8: Visualization of molecule trajectories with PyRID's Blender addon. Left: Example visualization with 50,000 particles. Right: GUI of the Blender addon.
Figure 10: MSD and rotational relaxation times of a rigid bead molecule match the theoretical predictions. (A) Mean squared displacement (MSD) of the rigid bead molecule computed with PyRID. The displacement in each dimension (colored markers) is in very good agreement with the theory (black line). (B) The rotational relaxation of the rigid bead molecule is also in close agreement with theory (gray lines) for each of the rotation axes (colored markers).
Figure 16: Performance test of the hierarchical grid approach. (A) Performance of the hierarchical grid. (B) Performance comparison between PyRID and ReaDDy. On a benchmark system with an Intel Core i5-9300H at 2.4 GHz and 24 GB DDR4 RAM, PyRID (blue line) outperforms ReaDDy (yellow). However, [11] obtained better performance and especially better scaling for ReaDDy on a different machine with an Intel Core i7 6850K processor at 3.8 GHz and 32 GB DDR4 RAM (green line).
PyRID: A Brownian dynamics simulator for reacting and interacting particles written in Python

November 2024 · 28 Reads

Recent technological developments in molecular biology have led to large data sets providing new insights into the molecular organisation of cells. To fully exploit their potential, these developments have to be complemented by computer simulations that allow one to gain an in-depth understanding of the underlying molecular principles. We developed the Python-based reaction-diffusion simulator PyRID, integrating many features for the efficient simulation of molecular biological systems. Among other features, PyRID is capable of simulating unimolecular and bimolecular reactions as well as pair-interactions, allowing it to assess the dynamics of systems ranging from individual interacting proteins to polydisperse substrates consisting of many different molecules. Furthermore, PyRID supports mesh-based compartments and surface diffusion of particles, enabling analyses of the interaction of (trans-)membrane proteins with intra- and extracellular proteins. PyRID is written entirely in Python, a programming language known for its readability and easy accessibility, such that the scientific community can easily extend PyRID to its current and future needs.
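
For orientation, the core propagation step of such a Brownian-dynamics simulator is compact. The sketch below shows the standard overdamped (Euler-Maruyama) update for translational motion; it illustrates the principle only and is not PyRID's actual API:

import numpy as np

def brownian_step(pos, forces, D, kT, dt, rng):
    """One overdamped-Langevin step for n particles in 3D.

    pos, forces : arrays of shape (n, 3)
    D           : per-particle diffusion coefficients, shape (n,)
    """
    drift = (D[:, None] / kT) * forces * dt            # deterministic part
    noise = np.sqrt(2 * D[:, None] * dt) * rng.standard_normal(pos.shape)
    return pos + drift + noise

Reactions, compartments, and rotational diffusion of rigid bead molecules then layer on top of this basic update.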


Calcium-based input timing learning

November 2024 · 16 Reads · 1 Citation

Stimulus-triggered synaptic long-term plasticity is the foundation of learning and other cognitive abilities of the brain. In general, long-term synaptic plasticity is subdivided into two different forms: homosynaptic plasticity describes synaptic changes at stimulated synapses, while heterosynaptic plasticity summarizes synaptic changes at non-stimulated synapses. For homosynaptic plasticity, the Ca²⁺-hypothesis pinpoints the calcium concentration within a stimulated dendritic spine as the key mediator or controller of the underlying biochemical and biophysical processes. For heterosynaptic plasticity, on the other hand, although theoretical studies attribute important functional roles to it, such as synaptic competition and cooperation, experimental results remain ambiguous regarding its manifestation and biological basis. By integrating insights from Ca²⁺-dependent homosynaptic plasticity with experimental data on dendritic Ca²⁺ dynamics, we developed a mathematical model that describes the complex temporal and spatial dynamics of calcium in dendrites. We show that the influx of calcium into a stimulated spine can lead to its diffusion to neighboring spines, triggering heterosynaptic effects such as synaptic competition or cooperation. By considering different input strengths, our model explains the ambiguity of reported experimental results on heterosynaptic plasticity, suggesting that the Ca²⁺-hypothesis of homosynaptic plasticity can be extended to also model heterosynaptic plasticity. Furthermore, our model predicts that a synapse can modulate the expression of homosynaptic plasticity at a neighboring synapse in an input-timing-dependent manner, without the need for postsynaptic spiking. The resulting sensitivity of synaptic plasticity to input-spike timing can be influenced by the distance between the involved spines as well as by the local diffusion properties of the connecting dendritic shaft, providing a new mode of dendritic computation.
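
The central mechanism of the model, Ca²⁺ entering at a stimulated spine and diffusing along the shaft to a neighboring spine, can be caricatured in a one-dimensional diffusion scheme. The sketch below uses hypothetical parameters and a lumped decay term; it is not the published model:

import numpy as np

D, dx, dt = 0.22, 0.1, 1e-3     # um^2/ms, um, ms (D*dt/dx^2 = 0.022, stable)
decay = 0.05                    # lumped extrusion/buffering rate (1/ms)
n, steps = 200, 2000
ca = np.zeros(n)                # Ca2+ profile along the dendritic shaft
stim, neighbor = 100, 120       # shaft positions of two spines

for t in range(steps):
    lap = np.zeros(n)
    lap[1:-1] = ca[2:] - 2 * ca[1:-1] + ca[:-2]   # discrete Laplacian
    ca += dt * (D / dx**2 * lap - decay * ca)
    if t < 100:                 # brief influx at the stimulated spine only
        ca[stim] += 1.0 * dt

# ca[neighbor] is the transient that could drive heterosynaptic plasticity;
# its size depends on the inter-spine distance and the diffusion properties.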


Morphological correlates of synaptic protein turnover in the mouse brain

August 2024 · 62 Reads

Life Science Alliance

Fengxia Li · Julius N Bahr · [...] · Silvio O Rizzoli

Synaptic proteins need to be replaced regularly to maintain function and to prevent damage. It is unclear whether this process, known as protein turnover, relates to synaptic morphology. To test this, we relied on nanoscale secondary ion mass spectrometry to detect newly synthesized synaptic components in the brains of young adult (6 mo old) and aged mice (24 mo old), and on transmission electron microscopy to reveal synapse morphology. Several parameters correlated with turnover, including pre- and postsynaptic size, the number of synaptic vesicles, and the presence of a postsynaptic nascent zone. In aged mice, the turnover of all brain compartments was reduced by ∼20%. The turnover rates of the pre- and postsynapses correlated well in aged mice, suggesting that they are subject to common regulatory mechanisms. This correlation was poorer in young adult mice, in line with their higher synaptic dynamics. We conclude that synapse turnover is reflected by synaptic morphology.


Schematic of the model used to investigate ripple activity and plasticity
(A) A predefined excitatory feed-forward structure was embedded in an excitatory-inhibitory network, and its evolution was tested with and without stimulation, as well as with different sensitivities of nonlinear dendrites. (B) Factor applied to the synaptic weights within the feed-forward structure depending on the index difference between post- and presynaptic neuron (spatial kernel). (C) Weight change depending on the time difference between post- and presynaptic spikes (STDP-curve)
Ripple activity for spontaneous and stimulated replays
(A) Time-course of the mean firing rate for all neurons in different network configurations. (B-C) Absolute value of the continuous wavelet transform coefficient over the relevant frequency range, averaged over 30 s before (B) and after (C) the rest phase. Ripple frequency is about 150 Hz. (D-F) Top panel: mean rates of feed-forward (gray) and inhibitory (red) neuron populations and all neurons (green). Middle panel: time-dependent continuous wavelet transform. Bottom panel: spike raster plots of the different configurations at different time points. Left: before rest phase; Middle: rest phase; Right: after rest phase. Yellow boxes mark neurons with additional stimulation. (D) Low-excitability network with stimulation. (E) High-excitability network without stimulation. (F) High-excitability network with stimulation.
Synaptic weight changes induced by rest phase for different network configurations
(A-C) Synaptic weights before (green curve and points) and after rest phase (red curve and points) depending on the distance between place fields of post- and presynaptic neuron within the feed-forward structure. (D-F) Change of the synaptic weights during the rest phase depending on pre- and postsynaptic neuron index. Neurons 100–300 represent the feed-forward structure with strong initial synaptic weights above the main diagonal (dashed line). (G-I) Time courses for individual synaptic weights (initial weight indicated by trace color). Dashed lines mark the threshold for non-linear dendritic integration.
Results from a reduced model of plasticity dynamics in a feed-forward structure
(A) Reduced model for studying the parameter dependency of potentiation at a single postsynaptic neuron within a feed-forward structure with N neurons at every stage. (B, D) Change in synaptic weight, measured by the conductance g0 (color-coded), of a single synapse receiving Poisson input at 5 Hz (B) or 10 Hz (D), for different initial weights (y-axis) and thresholds for non-linear dendritic integration (x-axis). Potentiation is additionally highlighted with red bounding boxes. Green dashed lines highlight the dendritic thresholds for low and high excitability. (C) Number of dendritic spikes within the simulations from panel B. Large weight changes occur only in the regimes where dendritic spikes are prevalent. The smallest weights for which potentiation occurs increase with increasing dendritic thresholds. Above a certain initial weight, depression is observed, which can be attributed to synaptic scaling. Hence, there is a stable fixed point at a high synaptic weight, corresponding to a consolidated connection. The transition between LTP and LTD occurs at smaller weights when cells are more active. (E-F) Dependency of weight change on initial weight (y-axis), correlation between pre-synaptic Poisson spike trains (x-axis), and the number of such spike trains (columns, N indicated in title). For the conditions of the spontaneously active network (panel E), positive weight change occurs already for medium weights and for small numbers of presynaptic neurons and small correlation, whereas networks with low excitability need higher correlations or weights (panel F).
Hypothesized progression in the consolidation of episodic memories
(A-D) Schematic of the proposed progression: Immediately after learning (panel B), dendrites become more excitable (red circles) and external stimulation triggers replays (lightning symbol), such that even weak, memory-related synapses are potentiated. Later on, the external stimulation ceases (panel C), but dendrites still have enhanced excitability, such that all sufficiently strong synapses continue to grow. Finally, dendritic excitability decreases, but the feed-forward structure is sufficiently potentiated to still spontaneously reactivate and, thereby, maintain strong synaptic weights. (E) Time evolution of synaptic weights without stimulation or enhanced dendritic excitability. All weights decay. (F) Time evolution of synaptic weights in a network undergoing the progression in panels A-D. After transient stimulation and enhanced dendritic excitability, the memory is strong enough to sustain itself. Solid grey lines indicate transitions between phases, whereas dashed grey lines mark the dendritic spiking threshold of the respective phase. (G-I) The weights after the different phases in B-D, mapped to the distance between the pre- and postsynaptic neurons' place fields. The feed-forward nature is preserved.
Differences in the consolidation by spontaneous and evoked ripples in the presence of active dendrites

June 2024 · 33 Reads

Ripples are a typical form of neural activity in hippocampal neural networks, associated with the replay of episodic memories during sleep as well as with sleep-related plasticity and memory consolidation. The emergence of ripples has been observed both dependent on and independent of input from other brain areas, and often coincides with dendritic spikes. Yet, it is unclear how input-evoked and spontaneous ripples, as well as dendritic excitability, affect plasticity and consolidation. Here, we use mathematical modeling to compare these cases. We find that consolidation as well as the emergence of spontaneous ripples depends on a reliable propagation of activity in the feed-forward structures which constitute memory representations. This propagation is facilitated by excitable dendrites, which entail that a few strong synapses are sufficient to trigger neuronal firing. In this situation, stimulation-evoked ripples lead to the potentiation of weak synapses within the feed-forward structure and, thus, to the consolidation of a more general sequence memory. However, spontaneous ripples that occur without stimulation only consolidate a sparse backbone of the existing strong feed-forward structure. Based on this, we test a recently hypothesized scenario in which the excitability of dendrites is transiently enhanced after learning, and show that such a transient increase can strengthen, restructure and consolidate even weak hippocampal memories, which would otherwise be forgotten. Hence, a transient increase in dendritic excitability would indeed provide a mechanism for stabilizing memories.
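
The plasticity rule sketched in the model schematic (the "STDP-curve", weight change as a function of the time difference between post- and presynaptic spikes) has the standard pair-based exponential form. A minimal version with illustrative amplitudes and time constants (not the paper's exact values) is:

import numpy as np

A_plus, A_minus = 0.01, 0.012     # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0  # STDP time constants (ms)

def stdp_dw(dt_post_minus_pre):
    """Weight change for a single pre/post spike pair (dt in ms)."""
    if dt_post_minus_pre > 0:     # pre fires before post -> potentiation
        return A_plus * np.exp(-dt_post_minus_pre / tau_plus)
    return -A_minus * np.exp(dt_post_minus_pre / tau_minus)  # depression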


Conceptual illustration of information storage in working memory. The neural network can store and recall features of an item. The volatile nature of the synapses allows the memory of the stored features to fade over time. After depletion of the memory, the features of a different item can be stored without significant interference.
Ag-based volatile memristive device characterization. (A) Sketch of the one-transistor/one-resistor (1T1R) RRAM device together with its working principle. The memristive device (1R) is based on a W/C/10 nm HfO₂/Ag stack. The RRAM shows volatile behavior, i.e., a set operation followed by a spontaneous switch-off. (B) Time characterization of filament retention. After a 5 V amplitude triangular pulse to switch the cell on with I_CC = 20 µA, a constant reading voltage of −150 mV is applied to monitor retention. (C) Retention-time distributions at different compliance currents. The median retention time increases with the compliance current. Inset: applied programming pulse. (D) Switching probability of the device for a single pulse as a function of the amplitude and duration of the programming pulse. Shorter pulses require higher voltage amplitudes to switch ON. (E) Probability of switching the RRAM to the LRS depending on the number of pulses and their amplitude (1 ms pulses). Circles are experimental data; solid lines are fits. (F) Effect of the number of pulses and the voltage amplitude (1.6 V) on the switching of the device. The switching of the device is stochastic: considering a group (burst) of pulses, the probability that the device is in the LRS within the group increases.
WM store and recall experiment. (A) High-level sketch of the working memory. (B) Schematic of the WM implementation: five volatile 1T1R memristive devices are arranged in a parallel configuration. The gate is chosen to set I_CC = 17 µA, which corresponds to a retention time of 28 ms. (C) Color–pattern encoding. (D) Store and recall experiment. During the store phase, a single pattern is fed to the network. Top colored plot: input stimuli; for ease of visualization, each pattern is colored as the color it encodes. Black dots in the bottom part of the upper plot indicate the stored pattern. Bottom plot: measured current fed to the post-neuron. The current threshold for recognition is indicated as a dashed black horizontal line. The traces are cropped on the x-axis to better highlight the salient events. (E) Correlation plot between the expected and measured currents based on the difference between the presented and the stored pattern. Results obtained from ten different store and recall experiments with P_ON = 5% and stimulation frequency f_stim = 50 Hz. (F) Accuracy of the system in distinguishing the stored pattern under different stimulation and switching conditions. (G) Average current error, defined as the difference between the measured and expected currents, over 100 patterns applied for the different conditions.
Large-scale simulation of WM. (A) Illustration of the network model. Five different memory items (A, B, C, D and E) can be stored in a recurrent network of spiking neurons. Corresponding strongly connected populations within the network transiently store memory items after activation. (B) Network activity of the WM model. Black dots show individual spikes of input (top) and network (bottom) neurons. Multiple phases of store and recall are shown. Insets show average firing rates (spikes per second, in Hz) over recall phases. This simulation used a current compliance of 330 µA, corresponding to an average retention time of 1.5 s, and a voltage amplitude of 0.5 V, corresponding to a switching probability of 5%. Network behavior using (C) a different retention-time distribution (changed compliance current) and (D) a different switching probability (changed applied voltage amplitude).
Associative symbolic WM with volatile memristive devices. (A) Illustration of the network for associative symbolic WM. (B) Sequence of store and recall. Store/recall input (top row) and WM neuron output spikes (bottom row) are shown. Inserted pictograms represent the decoded objects and recall queries. Store/recall inputs had a one-to-one fixed connectivity to WM neurons. After an association is stored, it can be recalled by cueing the network with an arbitrary memory element. (C) Decoding error plotted as a function of the time delay between store and first recall.
Tunable synaptic working memory with volatile memristive devices

October 2023 · 120 Reads · 3 Citations

Different real-world cognitive tasks evolve on different relevant timescales. Processing these tasks requires memory mechanisms able to match their specific time constants. In particular, working memory utilizes mechanisms that span orders of magnitude of timescales, from milliseconds to seconds or even minutes. This plenitude of timescales is an essential ingredient of working memory tasks like visual or language processing. This degree of flexibility is challenging in analog computing hardware because it requires the integration of several reconfigurable capacitors of different sizes. Emerging volatile memristive devices present a compact and appealing solution to reproduce reconfigurable temporal dynamics in a neuromorphic network. We present a demonstration of working memory using a silver-based memristive device whose key parameters, retention time and switching probability, can be electrically tuned and adapted to the task at hand. First, we demonstrate the principles of working memory in small-scale hardware to execute an associative memory task. Then, we use the experimental data in two larger-scale simulations, the first featuring working memory in a biological environment, the second demonstrating associative symbolic working memory.
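
The two electrically tunable knobs named in the abstract, switching probability (set by the programming voltage) and retention time (set by the compliance current), suggest a simple behavioral model. The sketch below is our own abstraction for illustration, not the authors' device equations:

import numpy as np

class VolatileMemristor:
    """Behavioral sketch: probabilistic set, spontaneous relaxation."""

    def __init__(self, p_switch=0.05, mean_retention=28e-3, seed=None):
        self.p_switch = p_switch              # per-pulse ON probability
        self.mean_retention = mean_retention  # mean ON-state lifetime (s)
        self.rng = np.random.default_rng(seed)
        self.off_time = -np.inf               # device starts in the HRS

    def pulse(self, now):
        # Each programming pulse sets the device with fixed probability;
        # over a burst of n pulses, P(ON) grows as 1 - (1 - p)^n.
        if self.rng.random() < self.p_switch:
            self.off_time = now + self.rng.exponential(self.mean_retention)

    def is_on(self, now):
        return now < self.off_time            # LRS until the filament decays

The defaults (p_switch = 0.05, mean_retention = 28 ms) mirror the operating point reported in the store-and-recall figure caption above.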


Figure 1. Comparison between calcium, sulfur, and nitrogen peaks as observed by nanoscale secondary ion mass spectrometry. (A) Nanoscale secondary ion mass spectrometry is a chemical-imaging technique which uses an ion source (Cs⁺ or O⁻) to produce a charged primary ion beam. When the beam strikes the surface of the sample, secondary ions of the opposite charge are created and measured. To analyze both ¹⁴N and ⁴⁰Ca in our samples, it was necessary to first analyze the samples using the Cs⁺ source. Once all samples were analyzed using Cs⁺, the instrument was switched over to a radio-frequency (RF) source, and ⁴⁰Ca was measured in a second run. (B) The intensity of the ⁴⁰Ca peak at various points from each image is compared with the intensity of the ¹⁴N peak at the same location. The data are divided into two groups: a Ca²⁺-binding group (⁴⁰Ca > 2, N = 104, P = 0.0123) and a non-Ca²⁺-binding group (⁴⁰Ca < 2, N = 462, P = 1.68 × 10⁻⁹). (C) The intensity of the ⁴⁰Ca peak at various points from each image is compared with the intensity of the ³²S peak at the same location. The data are divided into two groups: a Ca²⁺-binding group (⁴⁰Ca > 2, N = 104, P = 0.0001) and a non-Ca²⁺-binding group (⁴⁰Ca < 2, N = 462, P = 0.0317). Source data are available for this figure.
High-resolution analysis of bound Ca²⁺ in neurons and synapses

October 2023 · 72 Reads · 2 Citations

Life Science Alliance

Calcium (Ca²⁺) is a well-known second messenger in all cells, and is especially relevant for neuronal activity. Neuronal Ca²⁺ is found in different forms, with a minority being freely soluble in the cell and more than 99% being bound to proteins. Free Ca²⁺ has received much attention over the last few decades, but protein-bound Ca²⁺ has been difficult to analyze. Here, we introduce correlative fluorescence and nanoscale secondary ion mass spectrometry imaging as a tool to describe bound Ca²⁺. As expected, bound Ca²⁺ is ubiquitous. It does not correlate with free Ca²⁺ dynamics at the whole-neuron level, but does correlate significantly with the intensity of markers for GABAergic pre-synapses and glutamatergic post-synapses. In contrast, a negative correlation with pre-synaptic activity was observed, with lower levels of bound Ca²⁺ in the more active synapses. We conclude that bound Ca²⁺ may regulate neuronal activity and should receive more attention in the future.


Tunable Synaptic Working Memory with Volatile Memristive Devices

June 2023 · 283 Reads

Different real-world cognitive tasks evolve on different relevant timescales. Processing these tasks requires memory mechanisms able to match their specific time constants. In particular, working memory utilizes mechanisms that span orders of magnitude of timescales, from milliseconds to seconds or even minutes. This plenitude of timescales is an essential ingredient of working memory tasks like visual or language processing. This degree of flexibility is challenging in analog computing hardware because it requires the integration of several reconfigurable capacitors of different sizes. Emerging volatile memristive devices present a compact and appealing solution to reproduce reconfigurable temporal dynamics in a neuromorphic network. We present a demonstration of working memory using a silver-based memristive device whose key parameters, retention time and switching probability, can be electrically tuned and adapted to the task at hand. First, we demonstrate the principles of working memory in small-scale hardware to execute an associative memory task. Then, we use the experimental data in two larger-scale simulations, the first featuring working memory in a biological environment, the second demonstrating associative symbolic working memory.


FIGURE. (A) Input trace of a single synapse (left) and voltage trace (right) of a neuron. The neuron receives randomly timed excitatory and inhibitory input spikes. The emulator (yellow) matches Loihi (blue) perfectly in both cases. Note that Loihi uses the voltage register to count refractory time, which results in a functionally irrelevant difference after a spike (see Appendix). (B) Network simulation with excitatory and inhibitory neurons, driven by noise from an input population of Poisson spike generators. All spikes match exactly between the emulator and Loihi for all time steps; the figure shows the final time steps of a longer simulation.
FIGURE. Comparing an STDP learning rule performed with the emulator and with Loihi. (A) Sketch showing the setup. (B) Synaptic trace over many trials, showing the arithmetic mean and standard deviation; the inset shows the same data on a logarithmic scale. (C) Relative difference |w̃_L − w̃_B|/w̃_max between the plastic weight in the emulator, w̃_B, and the Loihi implementation, w̃_L, across simulations. (D) STDP weight change with respect to pre- and post-synaptic spike times; a subset of time steps is shown for visualization purposes.
Brian2Loihi: An emulator for the neuromorphic chip Loihi using the spiking neural network simulator Brian

November 2022 · 123 Reads · 15 Citations

Developing intelligent neuromorphic solutions remains a challenging endeavor. It requires a solid conceptual understanding of the hardware's fundamental building blocks. Beyond this, accessible and user-friendly prototyping is crucial to speed up the design pipeline. We developed an open source Loihi emulator based on the neural network simulator Brian that can easily be incorporated into existing simulation workflows. We demonstrate errorless Loihi emulation in software for a single neuron and for a recurrently connected spiking neural network. On-chip learning is also reviewed and implemented, with reasonable discrepancy due to stochastic rounding. This work provides a coherent presentation of Loihi's computational unit and introduces a new, easy-to-use Loihi prototyping package with the aim to help streamline conceptualization and deployment of new algorithms.
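
Since the emulator builds on Brian, working with it follows ordinary Brian 2 conventions. The snippet below is a plain Brian 2 leaky integrate-and-fire example of the kind of model the emulator targets; it deliberately uses only the standard Brian 2 API, not Brian2Loihi-specific objects:

from brian2 import NeuronGroup, StateMonitor, ms, run, start_scope

start_scope()
tau = 10 * ms
# Dimensionless LIF voltage relaxing toward 1, with threshold and reset.
G = NeuronGroup(1, 'dv/dt = (1 - v) / tau : 1',
                threshold='v > 0.8', reset='v = 0', method='exact')
M = StateMonitor(G, 'v', record=0)   # record the membrane trace
run(50 * ms)
print(M.v[0][:5])                    # first few voltage samples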


Decision Making by a Neuromorphic Network of Volatile Resistive Switching Memories

November 2022 · 60 Reads

The need for an electronic device that works on biologically relevant timescales with a small footprint has boosted research into a new class of emerging memories. Ag-based volatile resistive switching memories (RRAMs) feature a spontaneous change of device conductance with a similarity to biological mechanisms. They rely on the formation and self-disruption of a metallic conductive filament through an oxide layer, with a retention time ranging from a few milliseconds to several seconds, greatly tunable according to the maximum current flowing through the device. Here we present a neuromorphic system based on volatile RRAMs that is able to mimic the principles of biological decision-making behavior and tackle the two-alternative forced choice problem, in which a subject is asked to choose between two possible alternatives relying not on precise knowledge of the problem but on noisy perceptions.
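
As background, the two-alternative forced choice setting is commonly formalized as noisy evidence accumulation to one of two bounds. The sketch below is such a drift-diffusion caricature with hypothetical parameters; it illustrates the computational problem, not the memristive hardware itself:

import numpy as np

def tafc_trial(drift=0.5, noise=1.0, bound=1.0, dt=1e-3, rng=None):
    """One trial: accumulate noisy evidence until a bound is crossed."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 2), t    # chosen alternative, decision time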


Citations (14)


... Spiny structures on the dendrites can serve to receive synaptic inputs and at the same time undergo plastic changes [43]. Here, we consider a model that describes a number of such spines on a single dendritic branch [44]. The state and strength of these spines are subject to the previously mentioned calcium-based plasticity rule [21]. ...

Reference:

Plastic Arbor: a modern simulation framework for synaptic plasticity – from single synapses to networks of morphological neurons
Calcium-based input timing learning

... After several generations of evolution, electrophysiology has evolved from the early stage of synchronous electrical activities of a large number of cells, to non-invasive microelectrodes at the cellular level (Fenton Tomasello and Wlodkowic 2022). Calcium ions can be used as markers of neuronal excitability, and their dynamic imaging capabilities within cells have long been a focus of attention (Bonnin et al. 2024). Calcium imaging originated in the mid-1970s, proposed by Blinks et al. (Drew et al. 2019). ...

High-resolution analysis of bound Ca 2+ in neurons and synapses

Life Science Alliance

... Brian is hardware-agnostic and targets the neuroscience community for simulating large-scale SNN models. Brian2Loihi [14] is an emulator designed for implementing spiking neural networks on Loihi. It differs from SENSIM as it focuses on single small-scale recurrent neural networks with on-chip learning, while SENSIM primarily concentrates on inference emulation of large-scale SNN models. ...

Brian2Loihi: An emulator for the neuromorphic chip Loihi using the spiking neural network simulator Brian

... In the late 1980s, Kawato and Gomi [1] and Miyamoto et al. [2] first proposed an approach using neurorobotics to construct a series of robotic devices that investigated the adaptation of the cerebellum to motion. Since the mid-twentieth century, neuroscience has proven the significance of neurorobotics in several fields (for example, computer vision) based on its development in robot learning [3]. Notably, artificial intelligence (AI) is vital to computer vision because it facilitates deep learning [4]. ...

Editorial: Robust Artificial Intelligence for Neurorobotics

Frontiers in Neurorobotics

... Several examples of silicon dendrites have been previously modeled using analog and mixed-signal CMOS chips. These include silicon dendrites with passive/active properties and multiple compartments [84,6,38], silicon dendrites that use active channels [99], and multi-compartment silicon dendrites with N-methyl-D-aspartate (NMDA) and calcium (Ca 2+ ) channels [57,103]. These in silico dendrites not only model integration of synaptic memory along the dendritic cable, but also model key ion-channel properties to capture dynamics observed in dendrites. ...

Emulating Dendritic Computing Paradigms on Analog Neuromorphic Hardware
  • Citing Article
  • August 2021

Neuroscience

... In addition, CA2 sends excitatory projections to all other hippocampal regions and receives input from the entorhinal cortex and all other hippocampal regions except for CA1. CA2 has a fundamental role in hippocampal information processing [27]. ...

CA2 beyond social memory: Evidence for a fundamental role in hippocampal information processing
  • Citing Article
  • March 2021

Neuroscience & Biobehavioral Reviews

... Unconstrained potentiation of synapses, for example, introduces a positive feedback loop that increases activity correlations, which in turn recruits more potentiation, leading to pathological behaviour [4,5]. Synaptic changes due to long-term potentiation occur over multiple timescales [6,7] and require compensatory mechanisms [8] to avoid such pathological feedback loops. Several homeostatic mechanisms are thought to rein in rapid synaptic growth, which typically unfolds within seconds to minutes [9, 10]. ...

The biophysical basis underlying the maintenance of early phase long-term potentiation

... The assignments involve circumstantial panic remembrance and the Morris water maze [10]. As per the day of training, there should be significant requirements upon the processing of hippocampus and synaptic plasticity due to cognitive processes taking place during the day, be it "Morning" or "Evening" chronotype [19]. The role of mTOR components in long-term plasticity and new memory building is evident from the fact that aspects such as the axon of the Peripheral Nervous System (PNS) are resurrected, and the presynaptic terminals' ubiquitous protein synthesis mechanisms are proximally translated mTOR of the Central Nervous System (CNS) axons [20]. ...

A local circadian clock for memory?
  • Citing Article
  • January 2021

Neuroscience & Biobehavioral Reviews

... Michaelis et al. presented an algorithm for training a SNN to do robust trajectory planning on a Loihi neuromorphic chip [112]. The work made use of neuronal motifs to initialize and learn the features required to generate complex robotic movements. ...

Robust Trajectory Generation for Robotic Control on the Neuromorphic Research Chip Loihi

Frontiers in Neurorobotics

... Similarly, rates of protein mobility would have different effects on synaptic activity in pre- and postsynapses. Repeated stimulation has been estimated to quickly deplete the available pool of clathrin in the presynapses, 55 leading to inefficient endocytosis. An increase of clathrin mobility rates could compensate for such a clathrin deficit, allowing higher activity rates. ...

Quantitative Synaptic Biology: A Perspective on Techniques, Numbers and Expectations