ABSTRACT: Establishing the hippocampal cellular ensemble that represents an animal's environment involves the emergence and disappearance of place fields in specific CA1 pyramidal neurons, and the acquisition of different spatial firing properties across the active population. While such firing flexibility and diversity have been linked to spatial memory, attention and task performance, the cellular and network origin of these place cell features is unknown. Basic integrate-and-fire models of place firing propose that such features result solely from varying inputs to place cells, but recent studies suggest instead that place cells themselves may play an active role through regenerative dendritic events. However, owing to the difficulty of performing functional recordings from place cell dendrites, no direct evidence of regenerative dendritic events exists, leaving any possible connection to place coding unknown. Using multi-plane two-photon calcium imaging of CA1 place cell somata, axons and dendrites in mice navigating a virtual environment, here we show that regenerative dendritic events do exist in place cells of behaving mice, and, surprisingly, their prevalence throughout the arbour is highly spatiotemporally variable. Furthermore, we show that the prevalence of such events predicts the spatial precision and persistence or disappearance of place fields. This suggests that the dynamics of spiking throughout the dendritic arbour may play a key role in forming the hippocampal representation of space.
ABSTRACT: Multiple neural and synaptic phenomena take place in the brain. They operate over a broad range of timescales, and the consequences of their interplay are still unclear. In this work, I study a computational model of a recurrent neural network in which two dynamic processes take place: sensory adaptation and synaptic plasticity. Both phenomena are ubiquitous in the brain, but their dynamic interplay has not been investigated. I show that when both processes are included, the neural circuit is able to perform a specific computation: it becomes a generative model for certain distributions of input stimuli. The neural circuit is able to generate spontaneous patterns of activity that reproduce exactly the probability distribution of experienced stimuli. In particular, the landscape of the phase space includes a large number of stable states (attractors) that sample precisely this prior distribution. This work demonstrates that the interplay between distinct dynamical processes gives rise to useful computation, and proposes a framework in which neural circuit models for Bayesian inference may be developed in the future.
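The core idea of the abstract above, that a recurrent circuit shaped by plasticity can store experienced stimuli as attractors that its spontaneous dynamics then revisit, can be illustrated with a minimal Hopfield-style toy network. This is only a schematic sketch, not the model studied in the work (it omits sensory adaptation entirely, uses a standard Hebbian outer-product rule, and all sizes and patterns here are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of binary (+/-1) units; arbitrary for this sketch

# Two "experienced stimuli": hypothetical stand-ins for samples
# from the input distribution the abstract refers to.
patterns = rng.choice([-1, 1], size=(2, N))

# Hebbian outer-product weights, no self-connections.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def relax(state, steps=20):
    """Iterate the deterministic sign dynamics until the state settles."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties consistently
    return state

# Start from a corrupted version of pattern 0; the dynamics fall back
# into the stored attractor, i.e. the network "regenerates" a stimulus.
noisy = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
noisy[flip] *= -1
recovered = relax(noisy)
overlap = np.mean(recovered == patterns[0])
print(overlap)
```

With only two stored patterns and 10% corruption, the overlap typically returns to (or very near) 1.0, showing stored stimuli acting as stable states; the abstract's stronger claim, that attractor statistics match the full prior distribution over stimuli, requires the adaptation mechanism the paper introduces.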
ABSTRACT: We investigate a mean-field model of interacting synapses on a directed neural network. Our interest lies in the slow adaptive dynamics of synapses, which are driven by the fast dynamics of the neurons they connect. Cooperation is modeled from the usual Hebbian perspective, while competition is modeled by an original polarity-driven rule. The emergence of a critical manifold culminating in a tricritical point is crucially dependent on the presence of synaptic competition. This leads to a universal 1/t power-law relaxation of the mean synaptic strength along the critical manifold and an equally universal 1/√t relaxation at the tricritical point, to be contrasted with the exponential relaxation that is otherwise generic. In turn, this leads to the natural emergence of long- and short-term memory from different parts of parameter space in a synaptic network, which is the most original and important result of our present investigation.