Article · Literature Review

Abstract

Neurons carry out the many operations that extract meaningful information from sensory receptor arrays at the organism’s periphery and translate these into action, imagery and memory. Within today’s dominant computational paradigm, these operations, involving synapses, membrane ionic channels and changes in membrane potential, are thought of as steps in an algorithm or as computations. The role of neurons in these computations has evolved conceptually from that of a simple integrator of synaptic inputs until a threshold is reached and an output pulse is initiated, to a much more sophisticated processor with mixed analog-digital logic and highly adaptive synaptic elements.
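As a concrete point of reference, the "simple integrator of synaptic inputs until a threshold is reached" picture corresponds to the classic leaky integrate-and-fire model. A minimal Python sketch, with illustrative textbook constants rather than values from the review:

```python
import numpy as np

def lif_neuron(input_current, dt=0.1, tau_m=10.0, v_rest=-65.0,
               v_thresh=-50.0, v_reset=-65.0, r_m=10.0):
    """Leaky integrate-and-fire: integrate synaptic drive until threshold,
    emit an output pulse, and reset. Units are illustrative (ms, mV, MOhm, nA)."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Membrane equation: tau_m * dV/dt = -(V - V_rest) + R_m * I(t)
        v += dt / tau_m * (-(v - v_rest) + r_m * i_in)
        if v >= v_thresh:              # threshold reached: output pulse
            spike_times.append(step * dt)
            v = v_reset                # reset and integrate again
    return spike_times

# A constant 2 nA drive for 100 ms yields a regular spike train.
print(lif_neuron(np.full(1000, 2.0)))
```

The review's point is precisely that this picture is too simple: the mixed analog-digital logic and adaptive synaptic elements it describes have no place in these few lines.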
... Developing the next generation of artificial intelligence is an interdisciplinary endeavor, building upon advancements in computer science, neuroscience, and hardware. Two general lessons have emerged simultaneously from both engineering pursuits and naturalistic inquiry: (1) a neural system's performance typically improves with increasing number and connectivity of processing primitives, both in biology [1,2,3] and artificial intelligence [4,5]; and (2) analog processing with spiking communication in systems exhibiting complex temporal dynamics is both physically efficient and computationally powerful, again in the biological [6,7,3] or artificial domain [8,9]. The circuits presented in this work are designed to embrace both of these ideas by implementing synapses that receive single-photon communication events and perform analog computation with superconducting electronics. ...
... 6.25 µs, 500 nH synapse integrating behavior with different frequency pulse trains (columns), different values of I_sy (rows), and different numbers of pulses (colors). ...
... 6.25 µs, 2.5 µH synapse integrating behavior with different frequency pulse trains (columns), different numbers of pulses in a burst (rows), and different synaptic weights (colors). ...
Preprint
Superconducting optoelectronic hardware is being explored as a path towards artificial spiking neural networks with unprecedented scales of complexity and computational ability. Such hardware combines integrated-photonic components for few-photon, light-speed communication with superconducting circuits for fast, energy-efficient computation. Monolithic integration of superconducting and photonic devices is necessary for the scaling of this technology. In the present work, superconducting-nanowire single-photon detectors are monolithically integrated with Josephson junctions for the first time, enabling the realization of superconducting optoelectronic synapses. We present circuits that perform analog weighting and temporal leaky integration of single-photon presynaptic signals. Synaptic weighting is implemented in the electronic domain so that binary, single-photon communication can be maintained. Records of recent synaptic activity are locally stored as current in superconducting loops. Dendritic and neuronal nonlinearities are implemented with a second stage of Josephson circuitry. The hardware presents great design flexibility, with demonstrated synaptic time constants spanning four orders of magnitude (hundreds of nanoseconds to milliseconds). The synapses are responsive to presynaptic spike rates exceeding 10 MHz and consume approximately 33 aJ of dynamic energy per synapse event before accounting for cooling. In addition to neuromorphic hardware, these circuits introduce new avenues towards realizing large-scale single-photon-detector arrays for diverse imaging, sensing, and quantum communication applications.
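A toy sketch of the leaky-integration behavior described in this abstract, under the simplifying assumption (mine, not the paper's) that each single-photon presynaptic event adds a fixed, weight-scaled increment of loop current that then decays exponentially; this is a behavioral caricature, not a model of the superconducting circuit:

```python
import numpy as np

def leaky_synapse(spike_times, weight, tau, t_end, dt=0.01):
    """Each presynaptic event adds `weight` units of stored loop current,
    which decays with time constant `tau` (all times in microseconds here).
    Linear superposition is assumed for simplicity."""
    times = np.arange(0.0, t_end, dt)
    current = np.zeros_like(times)
    events = set(np.round(np.asarray(spike_times) / dt).astype(int))
    for k in range(1, len(times)):
        current[k] = current[k - 1] * (1.0 - dt / tau)  # exponential leak
        if k in events:
            current[k] += weight                        # weighted photon event
    return times, current

# Five events at 1 MHz integrated with a 6.25 us time constant, as in the
# figure-caption excerpts above.
t, i = leaky_synapse(spike_times=[1, 2, 3, 4, 5], weight=1.0,
                     tau=6.25, t_end=40.0)
print(i.max())
```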
... Even the passive properties of dendrites have a strong effect on neuronal computation (reviewed in [59,63,64]): they filter out high frequencies of the inputs (acting as a low-pass filter) and are also a source of current leak. Consequently, signals are attenuated and delayed as they travel along the dendrites. ...
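The passive filtering this excerpt describes is first-order low-pass behavior; a minimal discrete-time sketch (time constant and attenuation factor invented for illustration):

```python
import numpy as np

def passive_dendrite_filter(signal, dt=0.1, tau=5.0, attenuation=0.6):
    """First-order low-pass filter as a stand-in for a passive dendrite:
    high frequencies are removed, and the signal arriving at the soma is
    attenuated and delayed relative to the dendritic input."""
    out = np.zeros_like(signal, dtype=float)
    for k in range(1, len(signal)):
        # Discretized RC dynamics: relax toward the attenuated input.
        out[k] = out[k - 1] + dt / tau * (attenuation * signal[k] - out[k - 1])
    return out

# A brief square pulse arrives smoothed, smaller, and later.
pulse = np.zeros(200)
pulse[20:40] = 1.0
print(passive_dendrite_filter(pulse).max())  # well below the input amplitude
```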
... Dendrites are also known to have active computational properties [59,63,64], including dendritic APs which travel from the soma into the dendrite (opposite to the well-known APs travelling from the soma along the axon) [65]. The interaction between the active dendrites and the soma can further give rise to internal activity patterns, a form of feedback mechanism between these structures. ...
... The increasing interest in more complex neuron models with dendritic compartments is also fuelled by the realisation that a single biological neuron performs computations comparable to those of whole ANNs composed of point neurons [91,92,93] or to standard signal-processing methods, including low- and band-pass filtering, normalisation, gain control, saturation, amplification, multiplication and thresholding [66,64]. A recent study demonstrated that an ANN with 5 to 8 layers is required to fit a multicompartmental model of a single cortical pyramidal neuron [93]. ...
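A common abstraction behind the "single neuron as a multi-layer ANN" comparison [91,92,93] is a two-layer subunit model: each dendritic branch applies its own nonlinearity to its summed input before the soma combines and thresholds the branch outputs. A conceptual sketch with invented sigmoid parameters and thresholds:

```python
import numpy as np

def subunit_neuron(inputs_per_branch, w_branch=None):
    """Two-layer dendritic-subunit abstraction: per-branch sigmoids feed a
    somatic threshold. All parameters are illustrative."""
    branch_sums = np.array([np.sum(x) for x in inputs_per_branch])
    branch_out = 1.0 / (1.0 + np.exp(-4.0 * (branch_sums - 2.5)))  # branch sigmoids
    if w_branch is None:
        w_branch = np.ones_like(branch_out)
    return float(np.dot(w_branch, branch_out) > 0.5)               # somatic threshold

# The same total input drives the cell only when clustered on one branch,
# because a single branch sigmoid is pushed past its knee.
print(subunit_neuron([[1, 1, 1, 1], [0], [0]]))  # clustered -> 1.0 (fires)
print(subunit_neuron([[1], [1, 1], [1]]))        # distributed -> 0.0 (silent)
```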
Conference Paper
Spiking neural networks aspire to mimic the brain more closely than traditional artificial neural networks. They are characterised by a spike-like activation function inspired by the shape of an action potential in biological neurons. Spiking networks remain a niche area of research, perform worse than traditional artificial networks, and their real-world applications are limited. We hypothesised that neuroscience-inspired spiking neural networks with spike-timing-dependent plasticity demonstrate useful learning capabilities. Our objective was to identify features which play a vital role in information processing in the brain but are not commonly used in artificial networks, to implement them in spiking networks without copying the constraints that apply to living organisms, and to characterise their effect on data processing. The networks we created are not brain models; our approach can be labelled as artificial life. We performed a literature review and selected features such as local weight updates, neuronal sub-types, modularity, homeostasis and structural plasticity. We used the review as a guide for developing the consecutive iterations of the network, and eventually a whole evolutionary developmental system. We analysed the model’s performance on clustering of spatio-temporal data. Our results show that combining evolution and unsupervised learning leads to faster convergence on optimal solutions and better stability of fit solutions than either approach alone. The choice of fitness definition affects the network’s performance on fitness-related and unrelated tasks. We found that neuron type-specific weight homeostasis can be used to stabilise the networks, thus enabling longer training. We also demonstrated that networks with a rudimentary architecture can evolve developmental rules which improve their fitness. This interdisciplinary work provides contributions to three fields: it proposes novel artificial intelligence approaches, tests the possible role of the selected biological phenomena in information processing in the brain, and explores the evolution of learning in an artificial life system.
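For reference, the spike-timing-dependent plasticity mentioned in this abstract is commonly modeled with an exponential pair-based window; the constants below are generic textbook values, not those used in the thesis:

```python
import numpy as np

def stdp_dw(dt_post_minus_pre, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic spike (dt > 0), depress otherwise. Times in ms."""
    dt = np.asarray(dt_post_minus_pre, dtype=float)
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau),    # pre before post: LTP
                    -a_minus * np.exp(dt / tau))   # post before pre: LTD

# Weight changes for several post-minus-pre spike-time differences (ms).
print(stdp_dw([-40.0, -10.0, 5.0, 30.0]))
```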
... In the view of Charles F. Stevens [98], models are common in neuroscience, but theories are relatively scarce. Neuroscience has amassed various models to describe specific phenomena, but few theories offer general frameworks that span a wide range of facts and uncover the underlying connections between different problems. ...
Preprint
Full-text available
The brain works as a dynamic system to process information. Various challenges remain in understanding the connection between information and dynamics attributes in the brain. The present research explores how the characteristics of neural information functions are linked to neural dynamics. We attempt to bridge dynamics (e.g., Kolmogorov-Sinai entropy) and information (e.g., mutual information and Fisher information) metrics on the stimulus-triggered stochastic dynamics in neural populations. On the one hand, our unified analysis identifies various essential features of the information-processing-related neural dynamics. We discover spatiotemporal differences in the dynamic randomness and chaotic degrees of neural dynamics during neural information processing. On the other hand, our framework reveals the fundamental role of neural dynamics in shaping neural information processing. The neural dynamics creates an oppositely directed variation of encoding and decoding properties under specific conditions, and it determines the neural representation of the stimulus distribution. Overall, our findings demonstrate a potential direction to explain the emergence of neural information processing from neural dynamics and help understand the intrinsic connections between the informational and the physical brain.
... Model comparison on ion channel models: Ion channels play a key role in neuronal dynamics (Koch and Segev, 2000). Mechanistic models of ion channels are therefore a central object of computational neuroscience. ...
Preprint
Full-text available
A common problem in natural sciences is the comparison of competing models in the light of observed data. Bayesian model comparison provides a statistically sound framework for this comparison based on the evidence each model provides for the data. However, this framework relies on the calculation of likelihood functions which are intractable for most models used in practice. Previous approaches in the field of Approximate Bayesian Computation (ABC) circumvent the evaluation of the likelihood and estimate the model evidence based on rejection sampling, but they are typically computationally intensive. Here, I propose a new efficient method to perform Bayesian model comparison in ABC. Based on recent advances in posterior density estimation, the method approximates the posterior over models in parametric form. In particular, I train a mixture-density network to map features of the observed data to the posterior probability of the models. The performance is assessed with two examples. On a tractable model comparison problem, the underlying exact posterior probabilities are predicted accurately. In a use-case scenario from computational neuroscience -- the comparison between two ion channel models -- the underlying ground-truth model is reliably assigned a high posterior probability. Overall, the method provides a new efficient way to perform Bayesian model comparison on complex biophysical models independent of the model architecture.
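A minimal sketch of the approach, with two toy Gaussian simulators standing in for the competing models and scikit-learn's MLPClassifier standing in for the mixture-density network (both substitutions are mine; the paper trains a mixture-density network on richer features):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 5000
labels = rng.integers(0, 2, n)             # model index drawn from a uniform prior
sims = np.where(labels == 0,
                rng.normal(0.0, 1.0, n),   # toy simulator for model 0
                rng.normal(1.0, 1.5, n))   # toy simulator for model 1

# Train a classifier to map simulated data features to P(model | data);
# with a probabilistic loss, its outputs approximate the model posterior.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(sims.reshape(-1, 1), labels)

x_obs = np.array([[2.5]])                  # an 'observed' summary feature
print(clf.predict_proba(x_obs))            # approximate posterior over models
```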
Chapter
It has previously been shown that it is possible to derive a new class of biophysically detailed brain tissue models when one computationally analyzes and exploits the interdependencies of the multi-modal and multi-scale organization of the brain. These reconstructions, sometimes referred to as digital twins, enable a spectrum of scientific investigations. Building such models has become possible because of the increase in quantitative data as well as advances in computational capabilities and algorithmic and methodological innovations. This chapter presents the computational science concepts that provide the foundation for the data-driven approach to reconstructing and simulating brain tissue as developed by the EPFL Blue Brain Project, which was originally applied to neocortical microcircuitry and extended to other brain regions. Accordingly, the chapter covers aspects such as a knowledge graph-based data organization and the importance of the concept of a dataset release. We illustrate algorithmic advances in finding suitable parameters for electrical models of neurons and show how spatial constraints can be exploited for predicting synaptic connections. Furthermore, we explain how in silico experimentation with such models necessitates specific addressing schemes or requires strategies for an efficient simulation. The entire data-driven approach relies on the systematic validation of the model. We conclude by discussing complementary strategies that not only enable judging the fidelity of the model but also form the basis for its systematic refinements.
Chapter
Whole-Brain Modelling is a scientific field with a short history and a long past. Its various disciplinary roots and conceptual ingredients extend back to as early as the 1940s. It was not until the late 2000s, however, that a nascent paradigm emerged in roughly its current form, concurrently, and in many ways joined at the hip, with its sister field of macro-connectomics. This period saw a handful of seminal papers authored by a certain motley crew of notable theoretical and cognitive neuroscientists, which have served to define much of the landscape of whole-brain modelling as it stands at the start of the 2020s. At the same time, the field has over the past decade expanded in a dozen or more fascinating new methodological, theoretical, and clinical directions. In this chapter we offer a potted Past, Present, and Future of whole-brain modelling, noting what we take to be some of its greatest successes, hardest challenges, and most exciting opportunities.
Article
This article presents the argument that, while understanding the brain will require a multi-level approach, there is nevertheless something fundamental about understanding the components of the brain. I argue here that the standard description of neurons is not merely too simplistic, but also misses the true nature of how they operate at the computational level. In particular, the humble point neuron, devoid of dendrites with their powerful computational properties, prevents conceptual progress at higher levels of understanding.
Article
We explored a computational model of astrocytic energy metabolism and demonstrated the theoretical plausibility that this type of pathway might be capable of coding information about stimuli in addition to its known functions in cellular energy and carbon budgets. Simulation results indicate that glycogenolytic glycolysis triggered by activation of adrenergic receptors can capture the intensity and duration features of a neuromodulator waveform and can respond in a dose-dependent manner, including non-linear state changes that are analogous to action potentials. We show how this metabolic pathway can translate information about external stimuli to production profiles of energy-carrying molecules such as lactate with a precision beyond simple signal transduction or non-linear amplification. The results suggest the operation of a metabolic state-machine from the spatially discontiguous yet interdependent metabolite elements. Such metabolic pathways might be well-positioned to code an additional level of salient information about a cell’s environmental demands to impact its function. Our hypothesis has implications for the computational power and energy efficiency of the brain.
Article
Neuromorphic computing, composed of artificial synapses and neural network algorithms, is expected to replace the traditional von Neumann computer architecture in building next-generation intelligent systems due to its more energy-efficient features. In this work, a flexible Au/WOx/Pt/Mica memristor with a simple structure is fabricated by RF magnetron sputtering. Highly adjustable resistance states and the functions of biological synapses and neurons, such as short-/long-term plasticity, paired-pulse facilitation, and spike-timing-dependent plasticity, are demonstrated in the flexible WOx memristor. Furthermore, we established a convolutional neural network (CNN) architecture for pattern categorization on the Modified National Institute of Standards and Technology (MNIST) dataset and demonstrated that the recognition performance is comparable to that of a software-based neural network. These results provide a feasible approach for the realization of flexible neuromorphic computing systems in the future.
Article
Full-text available
Nonlinear, multiplication-like operations carried out by individual nerve cells greatly enhance the computational power of a neural system, but our understanding of their biophysical implementation is scant. Here we pursue this problem in the Drosophila melanogaster ON motion vision circuit, in which we record the membrane potentials of direction-selective T4 neurons and of their columnar input elements in response to visual and pharmacological stimuli in vivo. Our electrophysiological measurements and conductance-based simulations provide evidence for a passive supralinear interaction between two distinct types of synapse on T4 dendrites. We show that this multiplication-like nonlinearity arises from the coincidence of cholinergic excitation and release from glutamatergic inhibition. The latter depends on the expression of the glutamate-gated chloride channel GluClα in T4 neurons, which sharpens the directional tuning of the cells and shapes the optomotor behaviour of the animals. Interacting pairs of shunting inhibitory and excitatory synapses have long been postulated as an analogue approximation of a multiplication, which is integral to theories of motion detection, sound localization and sensorimotor control.
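The long-postulated shunting arrangement can be illustrated with a single-compartment steady-state calculation: excitation alone, release from tonic inhibition alone, and their conjunction. All conductances and reversal potentials below are invented for illustration and are not the paper's measurements:

```python
def membrane_response(g_exc, g_inh_release, e_exc=60.0, e_inh=-10.0,
                      g_leak=1.0, g_inh_tonic=2.0):
    """Steady-state potential (relative to rest) of a patch receiving
    excitation g_exc and tonic shunting inhibition that the stimulus
    releases by g_inh_release."""
    g_inh = max(g_inh_tonic - g_inh_release, 0.0)
    return (g_exc * e_exc + g_inh * e_inh) / (g_leak + g_exc + g_inh)

baseline = membrane_response(0.0, 0.0)
exc_alone = membrane_response(1.0, 0.0) - baseline
release_alone = membrane_response(0.0, 1.5) - baseline
together = membrane_response(1.0, 1.5) - baseline
# The conjunction exceeds the sum of the parts: a multiplication-like,
# supralinear interaction from purely passive conductances.
print(exc_alone, release_alone, together)
```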
Article
1. Compartmental modeling experiments were carried out in an anatomically characterized neocortical pyramidal cell to study the integrative behavior of a complex dendritic tree containing active membrane mechanisms. Building on a previously presented hypothesis, this work provides further support for a novel principle of dendritic information processing that could underlie a capacity for nonlinear pattern discrimination and/or sensory processing within the dendritic trees of individual nerve cells. 2. It was previously demonstrated that when excitatory synaptic input to a pyramidal cell is dominated by voltage-dependent N-methyl-D-aspartate (NMDA)-type channels, the cell responds more strongly when synaptic drive is concentrated within several dendritic regions than when it is delivered diffusely across the dendritic arbor. This effect, called dendritic "cluster sensitivity," persisted under wide-ranging parameter variations and directly implicated the spatial ordering of afferent synaptic connections onto the dendritic tree as an important determinant of neuronal response selectivity. 3. In this work, the sensitivity of neocortical dendrites to spatially clustered synaptic drive has been further studied with fast sodium and slow calcium spiking mechanisms present in the dendritic membrane. Several spatial distributions of the dendritic spiking mechanisms were tested with and without NMDA synapses. Results of numerous simulations reveal that dendritic cluster sensitivity is a highly robust phenomenon in dendrites containing a sufficiency of excitatory membrane mechanisms and is only weakly dependent on their detailed spatial distribution, peak conductances, or kinetics. Factors that either work against or make irrelevant the dendritic cluster sensitivity effect include 1) very high-resistance spine necks, 2) very large synaptic conductances, 3) very high baseline levels of synaptic activity, and 4) large fluctuations in level of synaptic activity on short time scales. 4. The functional significance of dendritic cluster sensitivity has been previously discussed in the context of associative learning and memory. Here it is demonstrated that the dendritic tree of a cluster-sensitive neuron implements an approximative spatial correlation, or sum of products operation, such as that which could underlie nonlinear disparity tuning in binocular visual neurons.
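The "sum of products" reading of cluster sensitivity can be sketched as follows, with a quadratic term standing in for the NMDA-type supralinearity (an assumption of this sketch, not the paper's biophysics): when paired afferents, say left- and right-eye inputs, terminate on the same cluster, the local nonlinearity generates their product, and the tree approximates a spatial correlation.

```python
import numpy as np

def cluster_response(left, right):
    """Paired inputs summed within a cluster, then passed through a local
    supralinearity (squaring). Expanding (l + r)^2 / 4 yields the cross
    term l*r, so the summed output approximates sum_i left_i * right_i
    plus input-dependent offsets."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    cluster_drive = left + right          # co-clustered afferents add locally
    return np.sum(cluster_drive ** 2) / 4.0

matched = cluster_response([1, 0, 1], [1, 0, 1])      # correlated patterns
mismatched = cluster_response([1, 0, 1], [0, 1, 0])   # anti-correlated patterns
print(matched, mismatched)  # the correlated pair yields the larger response
```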
Article
Simulations of a morphologically reconstructed cortical pyramidal cell suggest that the long, thin, distal dendrites of such a cell may be ideally suited for nonlinear coincidence-detection at time-scales much faster than the membrane time-constant. In the presence of dendritic sodium spiking conductances, such hypothetical computations might occur by two distinct mechanisms. In one mechanism, fast excitatory synaptic currents inside a thin dendrite create strong local depolarizations, whose repolarization—resulting from charge equalization—can be 100-fold faster than the membrane time-constant; two such potentials in exact coincidence might initiate a dendritic spike. In the alternate mechanism, dendritic sodium spikes which do not fire the soma nonetheless create somatic voltage pulses of millisecond width and a few millivolts amplitude. The soma may fire upon the exact coincidence of several of these dendritic spikes, while their strong delayed-rectifier currents prevent the soma from temporally summating them. The average firing rate of a compartmental simulation of this reconstructed cell can be highly sensitive to the precise (submillisecond) arrangement of its inputs; in one simulation, a subtle reorganization of the temporal and spatial distribution of synaptic events can determine whether the cell fires continuously at 200 Hz or not at all. The two cellular properties postulated to create this behavior—fast, strong synaptic currents and spiking conductances in the distal dendrites—are at least consistent with physiological recordings of somatic potentials from single and coincident synaptic events; further measurements are proposed. The amplitudes and decays of these simulated fast EPSPs and dendritic spikes can be quantitatively predicted by approximations based on dendritic properties, intracellular resistance, and transmembrane conductance, without invoking any free parameters. These expressions both illustrate the dominant biophysical mechanisms of these very transient events and also allow extrapolation of the simulation results to nearby parameter ranges without requiring further simulation. The possibility that cortical cells perform temporally precise computations on single spikes touches many issues in cortical processing: computational speed, spiking variability, population coding, pairwise cell correlations, multiplexed information transmission, and the functional role of the dendritic tree.
Article
1. Simultaneous whole-cell voltage and Ca2+ fluorescence measurements were made from the distal apical dendrites and the soma of thick tufted pyramidal neurons in layer 5 of 4-week-old (P28-32) rat neocortex slices to investigate whether activation of distal synaptic inputs can initiate regenerative responses in dendrites. 2. Dual whole-cell voltage recordings from the distal apical trunk and primary tuft branches (540-940 microns distal to the soma) showed that distal synaptic stimulation (upper layer 2) evoking a subthreshold depolarization at the soma could initiate regenerative potentials in distal branches of the apical tuft which were either graded or all-or-none. These regenerative potentials did not propagate actively to the soma and axon. 3. Calcium fluorescence measurements along the apical dendrites indicated that the regenerative potentials were associated with a transient increase in the concentration of intracellular free calcium ([Ca2+]i) restricted to distal dendrites. 4. Cadmium added to the bath solution blocked both the all-or-none dendritic regenerative potentials and the local dendritic [Ca2+]i transients evoked by distal dendritic current injection. Thus, the regenerative potentials in distal dendrites represent local Ca2+ action potentials. 5. Initiation of distal Ca2+ action potentials by a synaptic stimulus required coactivation of AMPA- and NMDA-type glutamate receptor channels. 6. It is concluded that in neocortical layer 5 pyramidal neurons of P28-32 animals, glutamatergic synaptic inputs to the distal apical dendrites can be amplified via local Ca2+ action potentials which do not reach threshold for axonal AP initiation. As amplification of distal excitatory synaptic input is associated with a localized increase in [Ca2+]i, these Ca2+ action potentials could control the synaptic efficacy of the distal cortico-cortical inputs to layer 5 pyramidal neurons.
Article
Neurons maintain their electrical activity patterns despite channel turnover, cell growth, and variable extracellular conditions. A model is presented in which maximal conductances of ionic currents depend on the intracellular concentration of calcium ions and so, indirectly, on activity. Model neurons with activity-dependent maximal conductances modify their conductances to maintain a given behavior when perturbed. Moreover, neurons that are described by identical sets of equations can develop different properties in response to different patterns of presynaptic activity.
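A minimal sketch of this regulation scheme, using an invented calcium proxy and first-order dynamics rather than the paper's equations: each maximal conductance drifts so as to pull activity-driven calcium back to a target, so a perturbed neuron recovers its behavior.

```python
import numpy as np

def homeostatic_conductances(g0=0.5, g1=1.5, ca_target=1.0,
                             tau_g=50.0, dt=0.01, t_end=200.0):
    """Start from perturbed conductances; the inward conductance g[0] rises
    and the outward conductance g[1] falls whenever the calcium proxy is
    below target, and vice versa."""
    g = np.array([g0, g1])
    for _ in range(int(t_end / dt)):
        ca = 2.0 * g[0] / (g[0] + g[1])   # toy proxy: more inward current, more Ca
        err = ca_target - ca
        g[0] += dt / tau_g * err
        g[1] -= dt / tau_g * err
        g = np.clip(g, 0.0, None)
    return g

print(homeostatic_conductances())  # converges back toward the target set point
```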