About
Publications: 264
Reads: 30,009
Citations: 7,384 (since 2017)
Additional affiliations
January 1989 - present
Publications (264)
Learning classification tasks of (2^n × 2^n) inputs typically consists of ≤ n (2×2) max-pooling (MP) operators along the entire feedforward deep architecture. Here we show, using the CIFAR-10 database, that pooling decisions adjacent to the last convolutional layer significantly enhance accuracie...
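The (2×2) max-pooling operator referred to above can be sketched in a few lines of NumPy. This is a minimal illustration of the MP operator only, not the paper's architecture or code; the input values and shapes below are invented:

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max-pooling with stride 2 on a (2^n x 2^n) feature map."""
    h, w = x.shape
    assert h % 2 == 0 and w % 2 == 0
    # Split into 2x2 blocks and take the maximum of each block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

# A (2^n x 2^n) input admits at most n successive 2x2 poolings.
x = np.arange(16.0).reshape(4, 4)   # n = 2
p1 = max_pool_2x2(x)                # 2x2 map: [[5, 7], [13, 15]]
p2 = max_pool_2x2(p1)               # 1x1 map: [[15]]
print(p1)
print(p2)
```

Applying the operator n times reduces a (2^n × 2^n) map to a single value, which is why at most n such poolings fit along the architecture.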
We study bi-directional associative neural networks that, exposed to noisy examples of an extensive number of random archetypes, learn the latter (with or without the presence of a teacher) when the supplied information is enough: in this setting, learning is heteroassociative -- involving couples of patterns -- and it is achieved by reverberating...
Deep architectures consist of tens or hundreds of convolutional layers (CLs) that terminate with a few fully connected (FC) layers and an output layer representing the possible labels of a complex classification task. According to the existing deep learning (DL) rationale, the first CL reveals localized features from the raw data, whereas the subse...
The realization of complex classification tasks requires training of deep learning (DL) architectures consisting of tens or even hundreds of convolutional and fully connected hidden layers, which is far from the reality of the human brain. According to the DL rationale, the first convolutional layer reveals localized patterns in the input and large...
Learning classification tasks of (2^n × 2^n) inputs typically consists of ≤ n (2×2) max-pooling (MP) operators along the entire feedforward deep architecture. Here we show, using the CIFAR-10 database, that pooling decisions adjacent to the last convolutional layer significantly enhance accuracy success rates (SRs). In particular, average SRs of the...
Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, currently extended to hundreds, and are far from their biological realization. Their implausible biological dynamics relies on changing a weight in a non-local manner, as the number of routes between an output unit and a weight is typically large, u...
In the neural-network literature, "Hebbian learning" traditionally refers to the procedure by which the Hopfield model and its generalizations "store" archetypes (i.e., definite patterns that are experienced just once to form the synaptic matrix). However, the term "learning" in Machine Learning refers to the ability of the machine to extract feature...
Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, which are already extended to hundreds, and are far from their biological realization. Their implausible biological dynamics is based on changing a weight in a non-local manner, as the number of routes between an output unit and a weight is typi...
Power-law scaling, a central concept in critical phenomena, is found to be useful in deep learning, where optimized test errors on handwritten digit examples converge as a power-law to zero with database size. For rapid decision making with one training epoch, where each example is presented only once to the trained network, the power-law exponent increa...
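The power-law convergence described here can be reproduced on synthetic data with a log-log linear regression; the exponent and prefactor below are invented for illustration and are not values from the paper:

```python
import numpy as np

# Synthetic test errors following err = c * m^(-rho), where m is the
# database size (rho and c are illustrative, not fitted to real data).
m = np.array([1e2, 1e3, 1e4, 1e5])
rho_true, c = 0.5, 2.0
err = c * m ** (-rho_true)

# A power law is a straight line in log-log coordinates, so the
# exponent is recovered as minus the slope of a linear fit.
slope, intercept = np.polyfit(np.log(m), np.log(err), 1)
rho_est = -slope
print(round(rho_est, 3))   # recovers 0.5
```

On noisy measured errors the same fit gives an estimate of the exponent rather than the exact value.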
Real-time sequence identification is a core use-case of artificial neural networks (ANNs), ranging from recognizing temporal events to identifying verification codes. Existing methods apply recurrent neural networks, which suffer from training difficulties; however, performing this function without feedback loops remains a challenge. Here, we prese...
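A feedforward (loop-free) sequence identifier of the kind alluded to above can be caricatured as a matched filter sliding over a stream, with no recurrent state. The motif and stream below are invented for illustration and are not from the paper:

```python
import numpy as np

# Hypothetical example: identify a fixed temporal motif in a binary
# stream using only a feedforward (convolutional) operation.
motif = np.array([1, 0, 1, 1])
stream = np.array([0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1])

# Map {0,1} -> {-1,+1}; an exact match then scores len(motif).
w = 2 * motif - 1
windows = np.lib.stride_tricks.sliding_window_view(stream, len(motif))
scores = (2 * windows - 1) @ w
hits = np.flatnonzero(scores == len(motif))
print(hits.tolist())   # positions where the motif occurs: [1, 7]
```

Because each window is scored independently, no feedback loop or hidden state carries over between time steps.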
Synaptic plasticity is a long-lasting core hypothesis of brain learning that suggests local adaptation between two connecting neurons and forms the foundation of machine learning. The main complexity of synaptic plasticity is that synapses and dendrites connect neurons in series and existing experiments cannot pinpoint the significant imprinted ada...
The gap between the huge volumes of data needed to train artificial neural networks and the relatively small amount of data needed by their biological counterparts is a central puzzle in machine learning. Here, inspired by biological information-processing, we introduce a generalized Hopfield network where pairwise couplings between neurons are bui...
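The classical Hopfield building block underlying such a generalized network can be sketched as a standard Hebbian store-and-recall demo. This is not the paper's example-based coupling construction; the network size, load, and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                          # neurons, stored archetypes
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian pairwise couplings (classical Hopfield storage; the paper
# instead builds the couplings from noisy examples of the archetypes).
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Cue: archetype 0 with 10% of its bits flipped.
s = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1.0

# Zero-temperature asynchronous dynamics until a fixed point.
for _ in range(20):
    prev = s.copy()
    for i in range(N):
        s[i] = 1.0 if J[i] @ s >= 0.0 else -1.0
    if np.array_equal(s, prev):
        break

overlap = float(s @ xi[0]) / N         # overlap near 1 means retrieval
print(overlap)
```

At this low load (P/N = 0.025) the dynamics denoises the cue back to the stored archetype.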
In the neural-network literature, "Hebbian learning" traditionally refers to the procedure by which the Hopfield model and its generalizations "store" archetypes (i.e., definite patterns that are experienced just once to form the synaptic matrix). However, the term "learning" in Machine Learning refers to the ability of the machine to ext...
Refractoriness is a fundamental property of excitable elements, such as neurons, indicating the probability for re-excitation in a given time lag, and is typically linked to the neuronal hyperpolarization following an evoked spike. Here we measured the refractory periods (RPs) in neuronal cultures and observed that an average anisotropic absolute R...
Refractoriness is a fundamental property of excitable elements, such as neurons, indicating the probability for re-excitation in a given time-lag, and is typically linked to the neuronal hyperpolarization following an evoked spike. Here we measured the refractory periods (RPs) in neuronal cultures and observed that anisotropic absolute RPs could ex...
Refractory periods are an unavoidable feature of excitable elements, resulting in necessary time-lags for re-excitation. Herein, we measure neuronal absolute refractory periods (ARPs) in synaptic blocked neuronal cultures. In so doing, we show that their duration can be significantly extended by dozens of milliseconds using preceding evoked spikes...
An amendment to this paper has been published and can be accessed via a link at the top of the paper.
Attempting to imitate the brain’s functionalities, researchers have bridged between neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning (ML). Here, using neuronal cultures, we demonstrate that increased training frequency accelerates the neuronal adaptation...
Attempting to imitate the brain's functionalities, researchers have bridged between neuroscience and artificial intelligence for decades; however, experimental neuroscience has not directly advanced the field of machine learning. Here, using neuronal cultures, we demonstrate that increased training frequency accelerates the neuronal adaptation proces...
Ultrafast physical random bit generation at hundreds of Gb/s rates, with verified randomness, is a crucial ingredient in secure communication and has recently emerged using optics-based physical systems. Here we examine the inverse problem and measure the ratio of information bits that can be systematically embedded in a random bit sequence withou...
Recently, deep learning algorithms have outperformed human experts in various tasks across several domains; however, their characteristics are distant from current knowledge of neuroscience. The simulation results of biological learning algorithms presented herein outperform state-of-the-art optimal learning curves in supervised learning of feedfor...
Synchronization of coupled oscillators at the transition between classical physics and quantum physics has become an emerging research topic at the crossroads of nonlinear dynamics and nanophotonics. We study this unexplored field by using quantum dot microlasers as optical oscillators. Operating in the regime of cavity quantum electrodynamics (cQE...
Experimental evidence recently indicated that neural networks can learn in a different manner than was previously assumed, using adaptive nodes instead of adaptive links. Consequently, links to a node undergo the same adaptation, resulting in cooperative nonlinear dynamics with oscillating effective link weights. Here we show that the biological re...
Introduction: rTMS has been proven effective in the treatment of neuropsychiatric conditions, with class A (definite efficacy) evidence for treatment of depression and pain (Lefaucheur et al., 2014). The efficacy in stimulation protocols is, however, quite heterogeneous. Saturation of neuronal firing by HFrTMS without allowing time for recovery may...
Physical models typically assume time-independent interactions, whereas neural networks and machine learning incorporate interactions that function as adjustable parameters. Here we demonstrate a new type of abundant cooperative nonlinear dynamics where learning is attributed solely to the nodes, instead of the network links, whose number is s...
Neurons are the computational elements that compose the brain, and their fundamental principles of activity have been known for decades. According to the long-lasting computational scheme, each neuron sums the incoming electrical signals via its dendrites, and when the membrane potential reaches a certain threshold the neuron typically generates a spike to...
Introduction: Repetitive transcranial magnetic stimulation (rTMS) has been found to be a promising noninvasive therapeutic tool for a variety of neuropsychiatric conditions. The therapeutic utility of rTMS has been classified as having class A evidence for the treatment of depression and chronic pain. In different stimulation protocols intervals have been introduc...
We present an analytical framework that allows the quantitative study of statistical dynamic properties of networks with adaptive nodes that have memory, and is used to examine the emergence of oscillations in networks with response failures. The frequency of the oscillations was quantitatively found to increase with the excitability of the nodes an...
Neural networks are composed of neurons and synapses, which are responsible for learning in a slow adaptive dynamical process. Here we experimentally show that neurons act like independent anisotropic multiplex hubs, which relay and mute incoming signals following their input directions. Theoretically, the observed information routing enriches the...
The neuronal response to controlled stimulations in vivo has been classically estimated using a limited number of events. Here we show that hours of high-frequency stimulations and recordings of neurons in vivo reveal previously unknown response phases of neurons in the intact brain. Results indicate that for stimulation frequencies below a critica...
The increasing number of recording electrodes enhances the capability of capturing the network's cooperative activity; however, using too many monitors might alter the properties of the measured neural network and induce noise. Using a technique that merges simultaneous multi-patch-clamp and multi-electrode array recordings of neural networks in-vi...
Catastrophic failures are complete and sudden collapses in the activity of large networks such as economics, electrical power grids and computer networks, which typically require a manual recovery process. Here we experimentally show that excitatory neural networks are governed by a non-Poissonian reoccurrence of catastrophic failures, where their...
The experimental study of neural networks requires simultaneous measurements of a massive number of neurons, while monitoring properties of the connectivity, synaptic strengths and delays. Current technological barriers make such a mission unachievable. In addition, as a result of the enormous number of required measurements, the estimated network...
Chaos synchronization has been demonstrated as a useful building block for various tasks in secure communications, including a source of all-electronic ultrafast physical random number generators based on room temperature spontaneous chaotic oscillations in a DC-biased weakly coupled GaAs/Al0.45Ga0.55As semiconductor superlattice (SSL). Here, we ex...
Broadband spontaneous macroscopic neural oscillations are rhythmic cortical firing patterns that were extensively examined during the last century; however, their possible origin is still controversial. In this work we show how macroscopic oscillations emerge in solely excitatory random networks and without topological constraints. We experimentally a...
Realizations of low firing rates in neural networks usually require globally balanced distributions among excitatory and inhibitory links, while feasibility of temporal coding is limited by neuronal millisecond precision. We experimentally demonstrate μs precision of neuronal response timings under low stimulation frequencies, whereas moderate...
Periodic synchronization of activity among neuronal pools has been related to substantial neural processes and information throughput in the neocortical network. However, the mechanisms of generating such periodic synchronization among distributed pools of neurons remain unclear. We hypothesize that to a large extent there is interplay between the...
Consistency and predictability of brain functionalities depend on the reproducible activity of a single neuron. We identify a reproducible non-chaotic neuronal phase where deviations between concave response latency profiles of a single neuron do not increase with the number of stimulations. A chaotic neuronal phase emerges at a transition to conve...
In 1943 McCulloch and Pitts suggested that the brain is composed of reliable logic-gates similar to the logic at the core of today's computers. This framework had a limited impact on neuroscience, since neurons exhibit far richer dynamics. Here we propose a new experimentally corroborated paradigm in which the truth tables of the brain's logic-gate...
We experimentally show that the neuron functions as a precise time integrator, where the accumulated changes in neuronal response latencies, under complex and random stimulation patterns, are solely a function of a global quantity, the average time lag between stimulations. In contrast, momentary leaps in the neuronal response latency follow trends...
A classical view of neural coding relies on temporal firing synchrony among functional groups of neurons, however, the underlying mechanism remains an enigma. Here we experimentally demonstrate a mechanism where time-lags among neuronal spiking leap from several tens of milliseconds to nearly zero-lag synchrony. It also allows sudden leaps out of s...
The synchronization of chaotic lasers and the optical phase synchronization of light originating in multiple coupled lasers have both been extensively studied. However, the interplay between these two phenomena, especially at the network level, is unexplored. Here, we experimentally compare these phenomena by controlling the heterogeneity of the co...
An all-electronic physical random number generator at rates up to 80 Gbit/s is presented, based on weakly coupled GaAs/Ga_{0.55}Al_{0.45}As superlattices operated at room temperature. It is based on large-amplitude, chaotic current oscillations characterized by a bandwidth of several hundred MHz and does not require external feedback or conversion to...
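The extraction step of such a physical generator can be caricatured in a few lines: a chaotic signal is thresholded into raw bits and then debiased. The logistic map and the von Neumann step below are illustrative stand-ins, not the paper's superlattice signal or its actual post-processing:

```python
import numpy as np

# Toy stand-in for a chaotic analog signal: logistic-map iterates
# (the real generator samples chaotic superlattice current oscillations).
def logistic_series(n, x0=0.37, r=3.99):
    out = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

sig = logistic_series(20000)
raw = (sig > np.median(sig)).astype(np.uint8)   # threshold to raw bits

# Von Neumann debiasing: map pairs 01 -> 0, 10 -> 1, discard 00/11.
pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
bits = pairs[keep, 0]

bias = abs(float(bits.mean()) - 0.5)
print(len(bits), round(bias, 3))   # bias close to 0
```

The debiasing trades throughput (roughly half the pairs are discarded) for an output whose ones/zeros balance no longer depends on the raw threshold statistics.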
Nonlinear networks with time-delayed couplings may show strong and weak chaos, depending on the scaling of their Lyapunov exponent with the delay time. We study strong and weak chaos for semiconductor lasers, either with time-delayed self-feedback or for small networks. We examine the dependence on the pump current and consider the question of whet...
We propose a new experimentally corroborated paradigm in which the functionality of the brain's logic-gates depends on the history of their activity, e.g. an OR-gate that turns into a XOR-gate over time. Our results are based on an experimental procedure where conditioned stimulations were enforced on circuits of neurons embedded within a large-sca...