Emre Neftci
University of California, Irvine | UCI · Department of Cognitive Sciences

PhD

About

80
Publications
15,308
Reads
1,675
Citations
Additional affiliations
July 2012 - June 2015
University of California, San Diego
  • Position: Postdoc
January 2011 - June 2012
ETH Zurich
  • Position: Postdoc
February 2007 - June 2012
University of Zurich
  • Position: PhD Student

Publications (80)
Preprint
Full-text available
Backpropagation (BP) is the most successful and widely used algorithm in deep learning. However, the computations required by BP are challenging to reconcile with known neurobiology. This difficulty has stimulated interest in more biologically plausible alternatives to BP. One such algorithm is the inference learning algorithm (IL). IL has close co...
Preprint
Although deep Reinforcement Learning (RL) has proven successful in a wide range of tasks, one challenge it faces is interpretability when applied to real-world problems. Saliency maps are frequently used to provide interpretability for deep neural networks. However, in the RL domain, existing saliency map approaches are either computationally expen...
Preprint
Full-text available
Adaptive "life-long" learning at the edge and during online task performance is an aspirational goal of AI research. Neuromorphic hardware implementing Spiking Neural Networks (SNNs) are particularly attractive in this regard, as their real-time, event-based, local computing paradigm makes them suitable for edge implementations and fast learning. H...
Article
Full-text available
Modern computation based on the von Neumann architecture is today a mature cutting-edge science. In the von Neumann architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously. This data transfer is responsible for a large part of the power consumption. The next generation computer te...
Preprint
Full-text available
The brain is the perfect place to look for inspiration to develop more efficient neural networks. The inner workings of our synapses and neurons provide a glimpse at what the future of deep learning might look like. This paper shows how to apply the lessons learnt from several decades of research in deep learning, gradient descent, backpropagation...
Article
Despite the recent success of deep reinforcement learning (RL), domain adaptation remains an open problem. Although the generalization ability of RL agents is critical for the real-world applicability of Deep RL, zero-shot policy transfer is still a challenging problem since even minor visual changes could make the trained agent completely fail in...
Preprint
Full-text available
Modern computation based on the von Neumann architecture is today a mature cutting-edge science. In this architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously. This data transfer is responsible for a large part of the power consumption. The next generation computer technology is...
Preprint
Full-text available
Predictive coding (PC) is a general theory of cortical function. The local, gradient-based learning rules found in one kind of PC model have recently been shown to closely approximate backpropagation. This finding suggests that this gradient-based PC model may be useful for understanding how the brain solves the credit assignment problem. The model...
Preprint
Full-text available
To achieve the low latency, high throughput, and energy efficiency benefits of Spiking Neural Networks (SNNs), reducing the memory and compute requirements when running on neuromorphic hardware is an important step. Neuromorphic architectures allow massively parallel computation with variable and local bit-precisions. However, how different bit-p...
Preprint
While commercial mid-air gesture recognition systems have existed for at least a decade, they have not become a widespread method of interacting with machines. This is primarily due to the fact that these systems require rigid, dramatic gestures to be performed for accurate recognition that can be fatiguing and unnatural. The global pandemic has se...
Preprint
Full-text available
Many real-world mission-critical applications require continual online learning from noisy data and real-time decision making with a defined confidence level. Probabilistic models and stochastic neural networks can explicitly handle uncertainty in data and allow adaptive learning-on-the-fly, but their implementation in a low-power substrate remains...
Preprint
Despite the recent success of deep reinforcement learning (RL), domain adaptation remains an open problem. Although the generalization ability of RL agents is critical for the real-world applicability of Deep RL, zero-shot policy transfer is still a challenging problem since even minor visual changes could make the trained agent completely fail in...
Article
Full-text available
We present the Surrogate-gradient Online Error-triggered Learning (SOEL) system for online few-shot learning on neuromorphic processors. The SOEL learning system uses a combination of transfer learning and principles of computational neuroscience and deep learning. We show that partially trained deep Spiking Neural Networks (SNNs) implemented on neuromorphic hardware can r...
Conference Paper
Full-text available
Terrain classification is important for outdoor path planning, mapping, and navigation. We developed a reservoir-based spiking neural network (r-SNN) to classify three terrain types (i.e. grass, dirt, and road) in a botanical garden. It included a recurrent layer and a supervised layer. The input spike trains to the recurrent layer were generated f...
Preprint
Full-text available
We present the Surrogate-gradient Online Error-triggered Learning (SOEL) system for online few-shot learning on neuromorphic processors. The SOEL learning system uses a combination of transfer learning and principles of computational neuroscience and deep learning. We show that partially trained deep Spiking Neural Networks (SNNs) implemented on neuro...
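Surrogate-gradient learning, on which SOEL builds, replaces the spike nonlinearity's zero-almost-everywhere derivative with a smooth surrogate during the backward pass. A minimal sketch of the idea (the fast-sigmoid surrogate shape and the steepness parameter `beta` are assumptions, not taken from the paper):

```python
import numpy as np

def spike(v, v_th=1.0):
    """Heaviside spike nonlinearity: the forward pass is non-differentiable."""
    return (v >= v_th).astype(float)

def surrogate_grad(v, v_th=1.0, beta=10.0):
    """Surrogate derivative used in place of the Heaviside's zero/undefined
    gradient during backprop; a fast-sigmoid shape is assumed here."""
    return 1.0 / (beta * np.abs(v - v_th) + 1.0) ** 2

v = np.array([0.2, 0.9, 1.1, 2.0])   # membrane potentials (illustrative)
out = spike(v)                       # [0. 0. 1. 1.]
grad = surrogate_grad(v)             # peaks at v == v_th, decays away from it
```

During training, `spike` is used in the forward pass while `surrogate_grad` stands in for its derivative in the backward pass, which is what makes gradient descent applicable to spiking networks.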
Article
Full-text available
A growing body of work underlines striking similarities between biological neural networks and recurrent, binary neural networks. A relatively smaller body of work, however, addresses the similarities between learning dynamics employed in deep artificial neural networks and synaptic plasticity in spiking neural networks. The challenge preventing th...
Preprint
Multiplicative stochasticity such as Dropout improves the robustness and generalizability of deep neural networks. Here, we further demonstrate that always-on multiplicative stochasticity combined with simple threshold neurons are sufficient operations for deep neural networks. We call such models Neural Sampling Machines (NSM). We find that the pr...
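The always-on multiplicative stochasticity combined with threshold neurons described above can be sketched as a single stochastic layer. This is a minimal illustration under assumed parameters (noise placement, keep probability, and names are assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def nsm_layer(x, W, p_keep=0.5):
    """One Neural Sampling Machine-style layer (illustrative sketch):
    always-on multiplicative Bernoulli noise on the inputs, followed by
    a simple binary threshold neuron."""
    mask = rng.binomial(1, p_keep, size=x.shape)  # multiplicative stochasticity
    pre = (mask * x) @ W                          # noisy pre-activation
    return (pre > 0).astype(float)                # threshold nonlinearity

x = rng.normal(size=8)
W = rng.normal(size=(8, 4))
y = nsm_layer(x, W)
print(y.shape)  # (4,)
```

Repeated forward passes on the same input yield different binary outputs, so averaging over passes gives a Monte Carlo estimate of the layer's firing probabilities.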
Preprint
Full-text available
Recent work suggests that synaptic plasticity dynamics in biological models of neurons and neuromorphic hardware are compatible with gradient-based learning (Neftci et al., 2019). Gradient-based learning requires iterating several times over a dataset, which is both time-consuming and constrains the training samples to be independently and identicall...
Preprint
Full-text available
Spike-based communication between biological neurons is sparse and unreliable. This enables the brain to process visual information from the eyes efficiently. Taking inspiration from biology, artificial spiking neural networks coupled with silicon retinas attempt to model these computations. Recent findings in machine learning allowed the derivatio...
Conference Paper
Full-text available
Spike-based communication between biological neurons is sparse and unreliable. This enables the brain to process visual information from the eyes efficiently. Taking inspiration from biology, artificial spiking neural networks coupled with silicon retinas attempt to model these computations. Recent findings in machine learning allowed the derivatio...
Article
Neural networks are commonly trained to make predictions through learning algorithms. Contrastive Hebbian learning, which is a powerful rule inspired by gradient backpropagation, is based on Hebb's rule and the contrastive divergence algorithm. It operates in two phases, the free phase, where the data are fed to the network, and a clamped phase, wh...
Preprint
A growing body of work underlines striking similarities between spiking neural networks modeling biological networks and recurrent, binary neural networks. A relatively smaller body of work, however, discusses similarities between learning dynamics employed in deep artificial neural networks and synaptic plasticity in spiking neural networks. The cha...
Poster
Full-text available
Dell et al. (1997) implemented a hierarchical model of three layers (semantic, lexical, and phonological) to explain semantic and phonological speech errors for aphasic patients. Through a series of papers, Dell and his colleagues identify the strength of the connections between the units of the semantic and lexical, and the lexical and phonologica...
Poster
Full-text available
Speech production is a complicated task that is based on the ability to sequence at the phoneme, syllable, and word levels. In 1951, Karl Lashley outlined the problem of serial order in behavior, where he proposed the existence of an underlying parallel representation for the performance of serial behavior. The serial order of the sequence accordin...
Preprint
Full-text available
Neural networks are commonly trained to make predictions through learning algorithms. Contrastive Hebbian learning, which is a powerful rule inspired by gradient backpropagation, is based on Hebb's rule and the contrastive divergence algorithm. It operates in two phases, the forward (or free) phase, where the data are fed to the network, and a back...
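The two-phase update at the heart of contrastive Hebbian learning can be written compactly: a Hebbian term from the clamped (backward) phase minus an anti-Hebbian term from the free (forward) phase. A minimal sketch (function and variable names are assumptions, not the paper's notation):

```python
import numpy as np

def chl_update(x, h_free, h_clamped, lr=0.1):
    """Contrastive Hebbian weight update (illustrative sketch):
    Hebbian correlations from the clamped phase, where targets are
    imposed on the outputs, minus anti-Hebbian correlations from the
    free phase, where the network runs on the data alone."""
    return lr * (np.outer(x, h_clamped) - np.outer(x, h_free))

x = np.array([1.0, 0.0, 1.0])      # input activities
h_free = np.array([0.2, 0.8])      # unit activities after the free phase
h_clamped = np.array([0.0, 1.0])   # same units after the clamped phase
dW = chl_update(x, h_free, h_clamped)
print(dW.shape)  # (3, 2)
```

When the two phases agree, the update vanishes, which is why the rule can be read as implicitly performing gradient descent on the mismatch between the phases.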
Conference Paper
Full-text available
The more experienced a listener is on a subject, the easier it is for him to follow a conversation. Paradoxically, an expert listener may also fail to identify the new information that the speaker provides. The goal of this work is to provide a framework that captures these characteristics of speech perception. Many models of speech perception comp...
Conference Paper
Full-text available
Sensorimotor interaction is critical for motor control including speech [2, 3, 4]. Recently, it has been argued that speech planning at the phonological level involves an internal feedback control mechanism that can detect and correct phonological selection errors prior to overt speech output [1]. Evidence for this claim includes (i) speech error r...
Article
Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware designed to mimic the dynamics and architecture of biological neural networks. However, neuromorphic implementations of embedded learning at large scales that are both flexible and efficient have been hindered by a lack of a suitable algo...
Article
Full-text available
An ongoing challenge in neuromorphic computing is to devise general and computationally efficient models of inference and learning which are compatible with the spatial and temporal constraints of the brain. One increasingly popular and successful approach is to take inspiration from inference and learning algorithms used in deep neural networks. H...
Article
Full-text available
In neuromorphic circuits, stochasticity in the cortex can be mapped into the synaptic or neuronal components. The hardware emulation of these stochastic neural networks are currently being extensively studied using resistive memories or memristors. The ionic process involved in the underlying switching behavior of the memristive elements is conside...
Conference Paper
In recent years the field of neuromorphic low-power systems has gained significant momentum, spurring brain-inspired hardware systems which operate on principles that are fundamentally different from standard digital computers and thereby consume orders of magnitude less power. However, their wider use is still hindered by the lack of algorithms that c...
Conference Paper
Spike-timing-dependent plasticity (STDP) incurs both causal and acausal synaptic weight updates, for negative and positive time differences between pre-synaptic and postsynaptic spike events. For realizing such updates in neuromorphic hardware, current implementations either require forward and reverse lookup access to the synaptic connectivity tab...
Article
Current large scale implementations of deep learning and data mining require thousands of processors, massive amounts of off-chip memory, and consume gigajoules of energy. Emerging memory technologies such as nanoscale two-terminal resistive switching memory devices offer a compact, scalable and low power alternative that permits on-chip co-located...
Article
Full-text available
Spike-timing-dependent plasticity (STDP) incurs both causal and acausal synaptic weight updates, for negative and positive time differences between pre-synaptic and post-synaptic spike events. For realizing such updates in neuromorphic hardware, current implementations either require forward and reverse lookup access to the synaptic connectivity ta...
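The causal/acausal structure of STDP described above is commonly implemented with exponential spike traces, so that each weight update depends only on locally available quantities. A minimal pair-based sketch (all parameter values are assumptions):

```python
def stdp_dw(pre_spikes, post_spikes, dt=1.0, tau=20.0,
            a_plus=0.01, a_minus=0.012):
    """Pair-based STDP with exponential spike traces (illustrative sketch).
    Causal pairings (pre before post) potentiate via the pre-synaptic
    trace read out at each post-synaptic spike; acausal pairings (post
    before pre) depress via the post-synaptic trace at each pre spike."""
    x_pre = x_post = 0.0
    dw = 0.0
    for pre, post in zip(pre_spikes, post_spikes):
        x_pre += -dt / tau * x_pre + pre    # decay trace, add current spike
        x_post += -dt / tau * x_post + post
        dw += a_plus * x_pre * post         # causal update at a post spike
        dw -= a_minus * x_post * pre        # acausal update at a pre spike
    return dw

causal = stdp_dw([1, 0], [0, 1])    # pre fires one step before post
acausal = stdp_dw([0, 1], [1, 0])   # post fires one step before pre
print(causal > 0, acausal < 0)  # True True
```

Because each trace is updated locally per timestep, this formulation avoids the forward and reverse lookups into the connectivity table that the abstract identifies as a hardware bottleneck.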
Conference Paper
We present an approach to constructing a neuromorphic device that responds to language input by producing neuron spikes in proportion to the strength of the appropriate positive or negative emotional response. Specifically, we perform a fine-grained sentiment analysis task with implementations on two different systems: one using conventional spikin...
Article
Full-text available
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines, a class of neural network models that uses synaptic stochasticity as a means to Monte Carlo sampling and unsupervised learning. Similar to the original formulati...
Article
We present an approach to constructing a neuromorphic device that responds to language input by producing neuron spikes in proportion to the strength of the appropriate positive or negative emotional response. Specifically, we perform a fine-grained sentiment analysis task with implementations on two different systems: one using conventional spikin...
Article
In recent years the field of neuromorphic low-power systems that consume orders of magnitude less power has gained significant momentum. However, their wider use is still hindered by the lack of algorithms that can harness the strengths of such architectures. While neuromorphic adaptations of representation learning algorithms are now emerging, efficie...
Article
Full-text available
Inherent variability of chemical sensors makes necessary individual calibration of chemical detection systems. This shortcoming has traditionally limited usability of systems based on Metal Oxide (MOX) sensor arrays and prevented mass-production for some applications. Here, aiming at exploring transfer calibration between electronic nose systems, w...
Article
Full-text available
We often learn and recall long sequences in smaller segments, such as a phone number 858 534 22 30 memorized as four segments. Behavioral experiments suggest that humans and some animals employ this strategy of breaking down cognitive or behavioral sequences into chunks in a wide variety of tasks, but the dynamical principles of how this is achieve...
Article
Full-text available
Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce the Synaptic Sampling Machine (SSM), a stochastic neural network model that uses synaptic unreliability as a means to stochasticity for sampling. Synaptic unreliability plays the dual role...
Article
Neuromorphic engineering aims to design hardware that efficiently mimics neural circuitry and provides the means for emulating and studying neural systems. In this paper, we propose a new memristor-based neuron circuit that uniquely complements the scope of neuron implementations and follows the stochastic spike response model (SRM), which plays a...
Article
Full-text available
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages fr...
Article
Full-text available
The goal of a generative model is to capture the distribution underlying the data, typically through latent variables. After training, these variables are often used as a new representation, more effective than the original features in a variety of learning tasks. However, the representations constructed by contemporary generative models are usuall...
Article
We present a 65k-neuron integrate-and-fire array transceiver (IFAT) for spike-based neural computation with low-power, high-throughput connectivity. The internally analog, externally digital chip is fabricated on a 4×4 mm² die in 90 nm CMOS and arranged in 4 quadrants of 16k parallel addressable neurons. Each neuron circuit serves input spike events...
Article
Full-text available
Neuromorphic hardware offers an electronic substrate for the realization of asynchronous event-based sensory-motor systems and large-scale spiking neural network architectures. In order to characterize these systems, configure them, and carry out modeling experiments, it is often necessary to interface them to workstations. The software used for th...
Article
Full-text available
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages fr...
Conference Paper
Full-text available
Restricted Boltzmann Machines (RBMs) and Deep Belief Networks (DBNs) have been demonstrated to perform efficiently on a variety of applications, such as dimensionality reduction and classification. Implementation of RBMs on neuromorphic platforms, which emulate large-scale networks of spiking neurons, has significant advantages from concurrency and...
Article
Full-text available
The quest to implement intelligent processing in electronic neuromorphic systems lacks methods for achieving reliable behavioral dynamics on substrates of inherently imprecise and noisy neurons. Here we report a solution to this problem that involves first mapping an unreliable hardware layer of spiking silicon neurons into an abstract computationa...