Georgios Detorakis

PhD

About

42 Publications
6,999 Reads
506 Citations
Additional affiliations
January 2016 - July 2019
University of California, Irvine
Position
  • Postdoctoral researcher

Publications (42)
Article
Full-text available
In a previous work, we introduced a computational model of area 3b, which is built upon neural field theory and receives input from a simplified model of the index distal finger pad populated by a random set of touch receptors (Merkel cells). This model has been shown to be able to self-organize following the random stimulation of the finger pa...
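For context, the dynamic neural field formalism referenced here is typically written as an Amari-type integro-differential equation; a generic sketch is given below, and the specific kernel, nonlinearity, and plasticity rule used in the paper may differ:

```latex
\tau \frac{\partial u(x,t)}{\partial t} = -u(x,t)
  + \int_{\Omega} w(x-y)\, f\!\big(u(y,t)\big)\, dy + I(x,t)
```

where u(x,t) is the activity at cortical position x, w the lateral connectivity kernel, f a firing-rate nonlinearity, and I(x,t) the feed-forward input driven by the skin receptors.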
Article
Full-text available
We investigate the formation and maintenance of ordered topographic maps in the primary somatosensory cortex as well as the reorganization of representations after sensory deprivation or cortical lesion. We consider both the critical period (postnatal) where representations are shaped and the post-critical period where representations are maintaine...
Article
Full-text available
Several disorders are related to pathological brain oscillations. In the case of Parkinson's disease, sustained low-frequency oscillations (especially in the β-band, 13-30 Hz) correlate with motor symptoms. It is still under debate whether these oscillations are the cause of parkinsonian motor symptoms. The development of techniques enabling select...
Article
Full-text available
Spike-timing-dependent plasticity (STDP) incurs both causal and acausal synaptic weight updates, for negative and positive time differences between pre-synaptic and post-synaptic spike events. For realizing such updates in neuromorphic hardware, current implementations either require forward and reverse lookup access to the synaptic connectivity ta...
Article
Full-text available
Many real-world mission-critical applications require continual online learning from noisy data and real-time decision making with a defined confidence level. Brain-inspired probabilistic models of neural networks can explicitly handle the uncertainty in data and allow adaptive learning on the fly. However, their implementation in a compact, low-pow...
Article
Full-text available
Thermoelectric coolers are semiconductor heat pumps that can be used in precision temperature control applications. After designing a thermoelectric temperature control system, the primary challenge is tuning and testing control algorithms. For instance, developing proportional-integral-derivative controllers involves tuning gains until the desired...
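As an illustration of the kind of controller being tuned, here is a minimal discrete-time PID loop in Python; the gains, sampling period, and toy thermal plant are hypothetical and not taken from the paper:

```python
# Minimal discrete-time PID controller sketch (illustrative only; gains,
# sampling period, and plant model are hypothetical, not from the paper).
def pid_step(error, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    """One PID update: returns control output and updated (integral, prev_error)."""
    integral, prev_error = state
    integral += error * dt                     # accumulate integral term
    derivative = (error - prev_error) / dt     # finite-difference derivative
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

# Toy closed loop: first-order thermal plant driven toward a 25 C setpoint.
setpoint, temp, state = 25.0, 20.0, (0.0, 0.0)
for _ in range(200):
    u, state = pid_step(setpoint - temp, state)
    temp += 0.1 * (0.05 * u - 0.01 * (temp - 20.0))  # crude heat balance
print(round(temp, 2))
```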
Article
We propose a variation of the self-organizing map algorithm by considering the random placement of neurons on a two-dimensional manifold, following a blue noise distribution from which various topologies can be derived. These topologies possess random (but controllable) discontinuities that allow for a more flexible self-organization, especially w...
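The underlying update is the standard self-organizing map rule applied to neurons at arbitrary 2-D positions; a minimal NumPy sketch follows, where plain uniform random placement stands in for the paper's blue-noise sampling and the derived topologies are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
positions = rng.random((100, 2))       # neuron coordinates on the unit square
                                       # (a simplification; the paper samples
                                       #  positions from a blue noise distribution)
weights = rng.random((100, 3))         # codebook vectors for 3-D inputs

def som_step(x, weights, positions, lr=0.1, sigma=0.1):
    """One standard SOM update: move neighbours of the best-matching unit toward x."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))    # best-matching unit
    d = np.linalg.norm(positions - positions[bmu], axis=1)  # distance on the manifold
    h = np.exp(-d**2 / (2 * sigma**2))                      # neighbourhood function
    weights += lr * h[:, None] * (x - weights)
    return weights

for _ in range(1000):
    weights = som_step(rng.random(3), weights, positions)
```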
Preprint
Full-text available
Many real-world mission-critical applications require continual online learning from noisy data and real-time decision making with a defined confidence level. Probabilistic models and stochastic neural networks can explicitly handle uncertainty in data and allow adaptive learning-on-the-fly, but their implementation in a low-power substrate remains...
Article
Full-text available
We provide theoretical conditions guaranteeing that a self-organizing map efficiently develops representations of the input space. The study relies on a neural field model of spatiotemporal activity in area 3b of the primary somatosensory cortex. We rely on Lyapunov’s theory for neural fields to derive theoretical conditions for stability. We verif...
Preprint
Full-text available
We propose a variation of the self-organizing map algorithm by considering the random placement of neurons on a two-dimensional manifold, following a blue noise distribution from which various topologies can be derived. These topologies possess random (but controllable) discontinuities that allow for a more flexible self-organization, especially wi...
Preprint
Multiplicative stochasticity such as Dropout improves the robustness and generalizability of deep neural networks. Here, we further demonstrate that always-on multiplicative stochasticity combined with simple threshold neurons are sufficient operations for deep neural networks. We call such models Neural Sampling Machines (NSM). We find that the pr...
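A rough NumPy sketch of the idea behind such a layer is given below, assuming Bernoulli multiplicative noise on the pre-activations and a hard-threshold neuron; the exact NSM formulation (for instance, where the noise enters and how activity is normalized) may differ from this toy version:

```python
import numpy as np

rng = np.random.default_rng(0)

def nsm_layer(x, W, b, p=0.5):
    """Forward pass of a toy NSM-like layer: always-on Bernoulli multiplicative
    noise on the pre-activations followed by a hard threshold (Heaviside) neuron.
    This is a sketch of the idea, not the exact formulation in the paper."""
    pre = x @ W + b
    xi = rng.binomial(1, p, size=pre.shape)   # multiplicative stochasticity (always on)
    return (xi * pre > 0).astype(float)       # binary threshold neuron

x = rng.random((4, 10))                       # batch of 4 inputs, 10 features
W = rng.normal(size=(10, 5)); b = np.zeros(5)
print(nsm_layer(x, W, b))
```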
Chapter
This chapter addresses the robust stabilization of neuronal populations modeled as delayed neural fields. These models are integro-differential equations representing the spatio-temporal activity of cerebral structures and take into account the non-instantaneous communication between neurons. It is assumed that the stimulation signal impacts only a...
Article
Full-text available
Spike-Timing-Dependent Plasticity (STDP) is a bio-inspired local incremental weight update rule commonly used for online learning in spike-based neuromorphic systems. In STDP, the intensity of long-term potentiation and depression in synaptic efficacy (weight) between neurons is expressed as a function of the relative timing between pre- and post-s...
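For reference, the pair-based exponential STDP window commonly used in such systems can be sketched as follows; the amplitudes and time constants are illustrative, not those of the paper:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: dt = t_post - t_pre in ms.
    Positive dt (pre before post) potentiates; negative dt depresses.
    Amplitudes and time constants are illustrative, not the paper's."""
    if dt >= 0:
        return a_plus * np.exp(-dt / tau_plus)   # causal update (LTP)
    return -a_minus * np.exp(dt / tau_minus)     # acausal update (LTD)

print(stdp_dw(5.0), stdp_dw(-5.0))
```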
Article
Neural networks are commonly trained to make predictions through learning algorithms. Contrastive Hebbian learning, which is a powerful rule inspired by gradient backpropagation, is based on Hebb's rule and the contrastive divergence algorithm. It operates in two phases, the free phase, where the data are fed to the network, and a clamped phase, wh...
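A toy sketch of the two-phase contrastive Hebbian update for a single weight matrix is shown below; the layer sizes, the nonlinearity, and the stand-in clamped-phase activity are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def chl_update(x, h_free, h_clamped, W, lr=0.01):
    """Contrastive Hebbian weight update for one layer (toy sketch):
    strengthen input/hidden correlations observed in the clamped phase and
    weaken those observed in the free phase."""
    return W + lr * (np.outer(x, h_clamped) - np.outer(x, h_free))

x = rng.random(8)                         # input pattern
W = rng.normal(scale=0.1, size=(8, 4))    # input-to-hidden weights
h_free = np.tanh(x @ W)                   # free phase: network settles on its own
h_clamped = np.tanh(x @ W + rng.normal(scale=0.1, size=4))
                                          # stand-in for the activity obtained
                                          # when the outputs are clamped to targets
W = chl_update(x, h_free, h_clamped, W)
```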
Article
Full-text available
Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware. However, neuromorphic implementations of embedded learning at large scales that are both flexible and efficient have been hindered by a lack of a suitable algorithmic framework. As a result, most neuromorphic hardware are trained off-lin...
Preprint
Full-text available
Neural networks are commonly trained to make predictions through learning algorithms. Contrastive Hebbian learning, which is a powerful rule inspired by gradient backpropagation, is based on Hebb's rule and the contrastive divergence algorithm. It operates in two phases, the forward (or free) phase, where the data are fed to the network, and a back...
Article
Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware designed to mimic the dynamics and architecture of biological neural networks. However, neuromorphic implementations of embedded learning at large scales that are both flexible and efficient have been hindered by a lack of a suitable algo...
Article
Neural fields are integro-differential equations describing spatiotemporal activity of neuronal populations. When considering finite propagation speed of action potentials, neural fields are affected by space-dependent delays. In this paper, we provide conditions under which such dynamics can be robustly stabilized by a proportional feedback acting...
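A generic form of such a delayed neural field with proportional feedback is sketched below for orientation only; the paper's precise controller structure and assumptions may differ:

```latex
\tau \frac{\partial u(x,t)}{\partial t} = -u(x,t)
  + \int_{\Omega} w(x,y)\, f\!\big(u(y,\, t - d(x,y))\big)\, dy
  + \alpha(x)\, I(x,t), \qquad I(x,t) = -k\, u(x,t)
```

where d(x,y) is the space-dependent axonal delay, α(x) localizes the actuated region, and k is the proportional gain.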
Article
Full-text available
Computer science offers a large set of tools for prototyping, writing, running, testing, validating, sharing and reproducing results; computational science, however, lags behind. In the best case, authors may provide their source code as a compressed archive and they may feel confident their research is reproducible. But this is not exactly true. Jam...
Article
Full-text available
An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. H...
Conference Paper
Spike-timing-dependent plasticity (STDP) incurs both causal and acausal synaptic weight updates, for negative and positive time differences between pre-synaptic and postsynaptic spike events. For realizing such updates in neuromorphic hardware, current implementations either require forward and reverse lookup access to the synaptic connectivity tab...
Article
Full-text available
The present paper replicates a neuron model for thalamocortical relay neurons, proposed by X-J Wang. The model is conductance-based and takes advantage of the interplay between a T-type calcium current and a non-specific cation sag current, and is thus able to generate spindle and delta rhythms. Another feature of this model is the presence of a...
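The core of such a conductance-based model is a current-balance equation of the generic form below; the exact set of currents and their gating kinetics should be taken from the paper and the original Wang model:

```latex
C_m \frac{dV}{dt} = -I_T(V, h) - I_h(V) - I_L(V) - \sum_k I_k(V) + I_{app}
```

where I_T is the low-threshold T-type calcium current, I_h the non-specific cation sag current, I_L the leak, and the sum collects the remaining spike-generating currents.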
Article
While loops involving various neuronal circuits play a major role in brain function, certain dysfunctions can alter the interaction between particular cerebral structures. Although the precise mechanisms generating the motor symptoms of Parkinson's disease are still poorly understood, one of the hypotheses is...
Article
Full-text available
Neural fields are integro-differential equations that have been extensively used to model spatiotemporal evolution of neocortical areas (see [1] for a detailed review). Time-delayed neural fields have also been a matter of investigation since they take into account axonal delays [2]. On the other hand, time-delay finite dimensional systems have bee...
Article
Full-text available
Extracellular recordings with multi-electrode arrays are one of the basic tools of contemporary neuroscience. These recordings are mostly used to monitor the activities, understood as sequences of emitted action potentials, of many individual neurons. But the raw data produced by extracellular recordings are most commonly a mixture of activities fro...
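As a toy illustration of the first processing step applied to such raw data, a naive threshold-crossing spike detector is sketched below; the sampling rate, threshold, and refractory period are hypothetical and not taken from the paper:

```python
import numpy as np

def detect_spikes(signal, fs=20000.0, thresh_sd=4.0, refractory_ms=1.0):
    """Naive threshold-crossing spike detection on one extracellular channel.
    The threshold is a multiple of a robust noise estimate (median absolute
    deviation); all parameters are illustrative."""
    noise = np.median(np.abs(signal)) / 0.6745        # robust std estimate
    crossings = np.flatnonzero(signal < -thresh_sd * noise)
    refractory = int(refractory_ms * 1e-3 * fs)
    spikes, last = [], -refractory
    for t in crossings:                               # keep one event per refractory window
        if t - last >= refractory:
            spikes.append(t)
            last = t
    return np.array(spikes)

rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 20000)
trace[5000] -= 10.0                                   # inject one artificial spike
print(detect_spikes(trace))
```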
Article
http://ercim-news.ercim.eu/en97/special/optogenetics-to-unravel-the-mechanisms-of-parkinsonian-symptoms-and-optimize-deep-brain-stimulation
Article
The aim of the present work is the modeling of the formation, maintenance and reorganization of somatosensory cortical maps using the theory of dynamic neural fields. A dynamic neural field is a partial integro-differential equation that is used to model the cortical activity of a part of the cortex. Such a neural field is used in this work in orde...
Thesis
The aim of the present work is the modeling of the formation, maintenance and reorganization of somatosensory cortical maps using the theory of dynamic neural fields. A dynamic neural field is a partial integro-differential equation that is used to model the cortical activity of a part of the cortex. Such a neural field is used in this work in orde...
Article
Full-text available
We investigated the development of topologically organized representations of a restricted region of skin in the primary somatosensory cortex (SI), more precisely, area 3b of SI. We devised a computational model based on the dynamic neural field theory and on an Oja-like learning rule at the level of feed-forward thalamocortical connections [1]...
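The classic Oja rule, of which the paper uses a variant on the feed-forward thalamocortical weights, can be sketched as follows; this is the textbook form, not necessarily the exact rule used there:

```python
import numpy as np

def oja_step(w, x, lr=0.01):
    """Classic Oja rule: Hebbian growth with a decay term that keeps ||w|| bounded.
    Textbook form, shown for illustration; the paper's Oja-like variant may differ."""
    y = w @ x                          # post-synaptic activity
    return w + lr * y * (x - y * w)    # dw = lr * y * (x - y * w)

rng = np.random.default_rng(0)
w = rng.random(16)
for _ in range(5000):
    w = oja_step(w, rng.normal(size=16))
```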
Chapter
Full-text available
http://link.springer.com/book/10.1007/978-94-007-4792-0/page/1
