Article

Space-time correlations of neuronal firing related to memory storage capacity

Abstract

Most viable theories of memory require some form of synaptic modification dependent on the correlation of pre- and postsynaptic neuronal firing (which we will denote as the Hebb hypothesis). We show here that a possible consequence of this hypothesis is that the storage capacity of a network of highly interconnected neurons is related to the number of synapses and that this implies that the network can be excited into many different time sequence of firing patterns of assemblies of neurons. The important role played by the assembly (as defined by E. R. John) is discussed in detail. A modified Hebb hypothesis is proposed. The crucial experiments to test the model involve the use of two (or more) extracellular microelectrodes to record, simultaneously, the firing activity of several neurons and thus determine the spatial and temporal cross correlations after presenting a mature animal with a variety of stimuli.

... The basic themes and conclusions of our study will hold true for a broad range of models of neuronal function and interaction. However, for the sake of economy of presentation and concreteness of expression, we shall frame our development in terms of the microscopic model of neural network dynamics which has been proposed by Little [8] and explored by Little, Shaw, and other authors [9][10][11][12][13][14][15][16]. ...
... (This is the conventional wisdom; an interesting qualification will be noted in remark 1 of section 8, in connection with the proof of theorem 7.) Following refs. [8][9][10][11][12][13] and [15], we adopt p_i(σ_i(t)) = {1 + exp[−β_i σ_i(t) F_i(t)]}^{-1}, (2.2) where p(+1) denotes the conditional probability of firing, p(−1) the conditional probability of not firing, at the specified time t, given the states σ(t − τ) of the neurons one time-step earlier and hence the firing function F_i(t). The nonnegative parameter β_i^{-1} is a measure of the noisy character of signal transmission to neuron i. ...
... We note that if a neuron is delicately poised between firing and not firing, one or another of these random elements can act to tip the balance. In concert with the point of view expressed by others [20,21,5,[8][9][10][11][12][13], it is the conviction of the present author that these stochastic phenomena are not in general merely useless noise, but rather that they have been exploited by evolution and are somehow beneficial and instrumental to the stable and efficient operation of the brain as an information-processing machine par excellence. ...
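As a minimal numerical sketch of the stochastic firing rule quoted in the excerpt above (eq. 2.2), the snippet below assumes the usual Little-model reading in which σ_i = ±1 and F_i(t) is the summed synaptic signal to neuron i; the network size, couplings, and value of β are arbitrary illustrations, not parameters from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

def firing_probability(sigma, F, beta):
    """p_i(sigma_i) = 1 / (1 + exp(-beta * sigma_i * F_i)); sigma = +1 (fire) or -1 (quiet)."""
    return 1.0 / (1.0 + np.exp(-beta * sigma * F))

def step(state, J, beta):
    """One synchronous time-step: each neuron fires (+1) with probability p(+1)
    given the field produced by the previous state, otherwise takes the value -1."""
    F = J @ state                             # summed signal to each neuron
    p_fire = firing_probability(+1, F, beta)
    return np.where(rng.random(len(state)) < p_fire, 1, -1)

# toy network: 10 neurons with arbitrary symmetric couplings
N = 10
J = rng.normal(0.0, 1.0, (N, N))
J = (J + J.T) / 2.0
state = rng.choice([-1, 1], size=N)
for t in range(5):
    state = step(state, J, beta=2.0)
    print(t, state)
```

Larger β corresponds to less noisy transmission: the choice of firing becomes nearly deterministic as β grows, and nearly a coin flip as β approaches zero.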
Article
The dynamical behavior of a large assembly of interconnected neurons is so complex that a statistical approach to its description is imperative. Some formal and conceptual elements of an approach patterned after conventional nonequilibrium statistical mechanics are introduced. The discussion is framed in terms of a model of the microscopic dynamics in which (i) time is quantized in units of a universal delay time τ for signal transmission and (ii) at each time-step t, each neuron makes a stochastic choice of whether or not to fire, biased by the signal it receives from neurons which fired at time t − τ. The model, operating without external stimulus, defines an aperiodic, irreducible, homogeneous, finite Markov chain. Consequently the dynamical system is ergodic, being characterized by a unique stationary distribution over the available states, which is approached at large times independently of initial conditions. An approximate formulation of the time evolution of the state occupation probabilities is developed in terms of a master equation in continuous time. The nature of time-dependent and steady solutions of this master equation is studied. In accord with the Markov-chain formulation, the steady solution is found to be unique in the physical region. An important question is whether or not this unique solution, the so-called Kirchhoff solution, corresponds to thermodynamic equilibrium, or detailed balance. To answer this question, and at the same time carry the description to a more global, thermodynamic level, an analogy with a fictitious chemical-kinetic system is established, and a formula for the steady-state entropy production of the neural system is obtained in terms of macroscopic forces and fluxes. The latter variables, which provide the essential elements of the macroscopic description, are determined by the state-transition matrix and by the steady-state occupation probabilities. It is proven by counterexample that in general, neural nets will operate away from thermodynamic equilibrium; detailed balance will strictly prevail only under very special and perhaps artificial situations involving symmetry of the neuronal couplings. Aspects of the formalism are examined explicitly for two- and three-neuron systems.
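A hedged numerical companion to the Markov-chain claim in this abstract: for a tiny Little-type network the full 2^N-state transition matrix can be written out explicitly, and iterating any initial distribution relaxes to a single stationary distribution, as the ergodicity argument requires. The couplings and β below are arbitrary, and the factorized transition probability (neurons choosing independently) is the simplifying assumption.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N, beta = 3, 1.5
J = rng.normal(0.0, 1.0, (N, N))                      # arbitrary couplings

states = [np.array(s) for s in itertools.product([-1, 1], repeat=N)]  # 2^N microstates

def transition_prob(s_prev, s_next):
    """Probability of the joint next state given the previous state; neurons choose independently."""
    F = J @ s_prev
    p = 1.0 / (1.0 + np.exp(-beta * s_next * F))
    return float(np.prod(p))

T = np.array([[transition_prob(si, sj) for sj in states] for si in states])  # row-stochastic

# any initial distribution relaxes to the same (unique) stationary distribution
pi = np.zeros(len(states)); pi[0] = 1.0               # start fully concentrated on one state
for _ in range(500):
    pi = pi @ T
print("rows sum to 1:", np.allclose(T.sum(axis=1), 1.0))
print("stationary distribution:", np.round(pi, 4))
```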
... e model this would be equivalent to setting different trions to a + or − state sequentially during one cycle. In the case of the primary auditory cortex, which is tonotopically organized, different modules of cortex each have a characteristic best acoustic frequency, and a tone sequence or melody would sequentially activate these cortical modules. Shaw (1978) predicted that for neurons sampled in primary sensory cortex: (1) single stimuli would give complex responses in time (reverberations), (2) the response magnitude for sequences of input would depend on the order of presentation of inputs (e.g. bars rotating clockwise vs. counterclockwise for visual cortex), and (3) the response magnitud ...
Article
In this paper we examine models of neural coding in the central nervous system at both the cellular and multi-cellular level. The proliferation of neural network models proposed for modeling cognitive and mnemonic capabilities of brains or brain regions suggests the need for neurobiologists to directly test these models. We examine the assumptions of these models in light of the physiological and anatomical constraints of real neurons. We advocate an interaction between neurobiology and modeling efforts so that these models can evolve.
Chapter
The long-term behavior of neural networks following exposure to external stimuli is central to attempts at modeling brain activity and to the design of physical systems imitating biological mechanisms of memory storage and recall. A set of stimuli all eliciting the same final operating condition of the network defines an equivalence class of experiences which are said to be associated with the same stored memory. A given network may exhibit one, several, or many such attractors, while a given attractor may correspond to a fixed point, a terminal cycle, multiperiodic motion, or chaos. If the dynamical law by which the system updates its state is probabilistic rather than deterministic, one considers an ensemble rather than a particular system. The initial preparation of the system attendant to the imposition of a temporary stimulus is reflected in the specification of an initial probability distribution {p_i(0)} over microscopic system states i. One is then concerned with the behavior of the state occupation probabilities p_i(t) at asymptotically large times t.
Chapter
“Top-down” brain theory (based upon functional analysis of cognitive processes) is distinguished from “bottom-up” brain theory (as might be based on analysis of the dynamics of neural nets). “Cooperative computation” is proposed as a way of providing a style of analysis for studying the interactions of neural subsystems at various levels. The section on “Interacting Schemas for Motor Control” provides a top-down analysis of perception and the control of movement in the action-perception cycle. Perceptual “schemas” are introduced as the building blocks for the representation of the perceived environment, and motor schemas serve as control systems to be coordinated into programs for the control of movement. Next, two approaches to the design of machine vision systems are contrasted. The examples exhibit many of the insights to be gained from a top-down analysis but show that such an approach does not guarantee a unique functional analysis of the problem at hand. An algorithm for computing optic flow using the style of cooperative computation is presented. This model has not been validated yet by data from neurophysiology, but seems to be very much “in the style of the brain” and offers interesting insights into the evolution of hierarchical neural structures. Two established neural models that have developed through a rich interaction between theory and experiment are presented. One emphasizes the possible role of the cerebellum in parametric tuning of motor schemas; the other represents interaction between tectum and pretectum in visuomotor coordination in frogs and toads. A connection between these neural models and the top-down analysis of cognition is described. —The Editor
Chapter
The theoretical background of most physiological models of learning and memory rests on the idea that persistent synaptic modifications in neural networks are the basis of information storage. The neurophysiological approaches to the study of learning and memory have thus long been concerned with such questions as how and under what circumstances synaptic changes occur in the brain, as well as what could be the nature of the change itself.
Article
The demand for math and science skills in our technology-driven world is at a premium, and yet U.S. students continue to lag behind many other industrialized countries in these areas. This book, based on studies conducted on 8000 elementary school-aged children, proposes that not only is there a relationship between music and math comprehension, but that music can be utilized to heighten higher brain function and improve math skills. The enclosed CD-ROM includes (1) a recording of Allegro con spirito from Sonata for Two Pianos in D Major (K. 448), by Wolfgang Amadeus Mozart, performed by Murray Perahia and Radu Lupu, courtesy of Sony Classical™, and (2) a descriptive interactive version of the S.T.A.R.™ (Spatial-Temporal Animation Reasoning) software program. While this book's discussion of the breakthroughs in understanding of spatial-temporal reasoning abilities will be of particular interest to neuroscientists and cognitive researchers, the book is also accessible to parents and educators.
Article
In the spirit of Mountcastle’s [1] organizational principle for neocortical function, and strongly motivated by Fisher’s [2] model of physical spin systems, we have introduced [3] a new cooperative mathematical model of the cortical column. Our model incorporates an idealized substructure, the trion, which represents a localized group of neurons. The trion model allows for a completely new framework for information processing and associative memory storage and recall: Small networks of trions with highly symmetric interactions are found to yield hundreds to thousands of quasi-stable, periodic firing patterns, MP’s, which can evolve from one to another (see Fig. 1). Experience or learning would then modify the interactions (away from the symmetric values) and select out the desired MP’s (as in the selection principle of Edelman [4]). Remarkably, we have found that relatively small modifications in trion interaction strengths (away from the symmetric values) via a Hebb-type algorithm [5] will enhance and select out any desired MP. Conceptually this suggests a radically different approach from those information processing models which start at the opposite extreme of a randomly connected neural network with no periodic firing patterns, and then (via Hebb-type modifications [5] in the synaptic interactions) reinforce specific firing patterns. More recently [6], in studying the associative recall properties of the networks we find that, on the average, any of the initial firing configurations rapidly (in 2 to 4 time steps) projects onto an MP.
Article
Previously, Little and Shaw developed a model of memory in neural networks using the Hebb learning hypothesis and explicitly incorporating the known statistical fluctuations at the chemical synapses. They solved exactly the large fluctuation limit of the model and were able to examine the capacity for memory storage. However, the solutions were not physiologically interpretable. We now explicitly introduce assemblies of neurons within the network, and obtain fully interpretable, physiological solutions (again, in the large fluctuation limit) which lead to readily testable predictions.
Article
Brains and computers can both be described as information-processing systems. At the functional level they have a number of properties in common, but at the level of hardware they differ vastly. Relatively little is known of the relation between neural processes and the concepts of cognitive psychology, such as short- and long-term memory, semantic memory, learning, attention etc. There have been several attempts to build models for some aspects of this relation. It is now generally accepted that somehow the hypothesis of neural plasticity, due to Tanzi (1893) and worked out by Hebb (1949), is the basic neural mechanism for learning. Another hypothesis by Hebb, on the formation of cell assemblies (1949), has been largely neglected. There are good arguments to reconsider it. In this paper general considerations are presented on the structure and organization of memory, and the way it comes into existence. They are supported by a restricted simulation experiment that shows formation of cell assemblies. This experiment may be considered as an extended replication of a simulation experiment by Rochester et al. (1956). The general considerations are placed within the framework of general systems theory, and are relevant for other self-organizing processes as well, such as the formation of groups in populations, or ideas in a discipline of science.
Article
Previously, we developed a model of short and long term memory which was based on an analogy to the Ising spin system in a neural network. We assumed that the modification of the synaptic strengths was dependent upon the correlation of pre- and post-synaptic neuronal firing. This assumption we denote as the Hebb hypothesis. In this paper, we solve exactly a linearized version of the model and explicitly show that the capacity of the memory is related to the number of synapses rather than the much smaller number of neurons. In addition, we show that in order to utilize this large capacity, the network must store the major part of the information in the capability to generate patterns which evolve with time. We are also led to a modified Hebb hypothesis.
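A minimal sketch of the correlation-dependent synaptic modification that this abstract denotes as the Hebb hypothesis, written in the simplest outer-product form: every one of the roughly N² synapses is changed in proportion to the product of pre- and postsynaptic activity, which is why the storage medium scales with the number of synapses rather than the much smaller number of neurons. The static-pattern recall shown here is a generic illustration, not the authors' linearized spin-system solution or their time-evolving patterns.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 50, 10                                  # neurons and stored patterns (arbitrary sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebb-type modification: each synapse J_ij changes with the correlation of
# pre- (j) and postsynaptic (i) firing, so all N*(N-1) couplings carry the storage
J = np.zeros((N, N))
for xi in patterns:
    J += np.outer(xi, xi) / N
np.fill_diagonal(J, 0.0)

# recall: start from a corrupted copy of pattern 0 and relax deterministically
state = patterns[0].copy()
state[: N // 10] *= -1                         # flip 10% of the units
for _ in range(5):
    state = np.sign(J @ state)
print("overlap with stored pattern 0:", float(state @ patterns[0]) / N)
```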
Article
Some of the world's leading researchers in neural networks submitted their most recent results concerning their research in neural networks to the author for inclusion in this survey. Descriptive accounts of their collective papers are presented as well as a list of sources of information concerning neural networks, such as journals, books, and technical reports. The material is broken into categories related to established areas in computer science, robotics, neural modeling, and engineering.
Article
The cognitive correlates literature suggests that a general ability, probably Spearman's g, underlies most information processing/intelligence relationships. In the present paper we suggest that the nature of g is clarified by the following patterns: (a) response consistency has better predictive and convergent validity than does response speed, and (b) tasks which demand dynamic memory processing predict intelligence better than do tasks which require only stimulus encoding and simple stimulus/response translations. Accordingly, g appears related to the ability to flexibly and consistently reconfigure the contents of working memory. A possible physiological basis of this ability is the recruitment of the transient neural assemblies which underlie thought (after Hebb, 1949).
Article
The responses of single neurons in the primary and secondary auditory cortex of cat were recorded during the presentation of sequences consisting of five tones of different frequencies. Discharges to tones within these sequences usually (84%) exhibited a dependence on the ‘direction’ of the sequence (ascending, descending, or mixed frequencies). For sequences consisting of 5 tones of identical frequency (monotone) the response often depended on serial position, including cases in which the neuron only responded to later tones in the sequence. Comparison of responses to heterogeneous and monotone sequences showed that response dependence on serial position was a factor in response dependence on sequence direction. Auditory cortical neurons can exhibit stronger responses to a tone presented in a sequence than to the same tone presented alone. Hence, the responses to tones within sequences may not be highly predictable from the responses to isolated tones.
Article
We obtain testable, physiologically interpretable solutions in the high “temperature” limit to a neural network model of memory based on an Ising spin system analogy. The solutions exhibit localization of firing activity and appear to be in striking analogy to magnetic domain phenomena. The theory must now be investigated in the nonlinear, low “temperature” regime where the domain analogy can hold.
Article
It has previously been shown that Hebb learning in a single column in the trion model of cortical organization occurs by selection. Motivated by von Neumann's solution for obtaining reliability and by models of circulating cortical activity, we introduce Hebb intercolumnar couplings to achieve dramatic enhancements in reliability in the firing of connected columns. In order for these enhancements to occur, specific temporal phase differences must exist between the same inherent spatial-temporal memory patterns in connected columns. We then generalize the criteria of large enhancements in the global firing of the entire connected columnar network to investigate the case when different inherent memory patterns are in the columns. The spatial rotations as well as the temporal phases now are crucial. Only certain combinations of inherent memory patterns meet these criteria with the symmetry properties playing a major role. The columnar order of these memory patterns not in the same symmetry family can be extremely important. This yields the first higher-level architecture of a cortical language and grammar within the trion model. The implications of this result with regard to an innate human language and grammar are discussed.
Article
Dendritic bundles have been found throughout the mammalian brain. Unquestionably, these bundles must serve one or more important, fundamental roles in the brain's functioning. However, no physiological experiments to determine their function have been performed on these well-established anatomical units. We survey the numerous anatomical reports of bundling. In addition, we discuss several physiological possibilities for the functional significance of bundles.
Article
We developed a cooperative model of the cortical column incorporating an idealized subunit, the trion (which represents a localized group of neurons), and a discrete time step for firing. We found that networks composed of a small number of trions (with symmetric interactions) supported up to thousands of quasi-stable, periodic firing patterns (MPs) which could be selected out with only small changes in interaction strengths using a Hebb-type algorithm. Here we report a study of the associative recall properties showing striking features: By considering all possible initial firing patterns (for a given set of network connections), we find 1) It takes on the average only 2-5 time steps to recall an MP. 2) Many of the MPs can be individually accessed by thousands of different initial patterns. The variety of examples presented illustrate the rich, general nature of the model.
Article
Objective: To study the effect of facilitation exercises using the vestibulo-ocular reflex on ophthalmoplegia due to brainstem injury. Design: A single-baseline design (A-B: A without specific therapy, B with specific therapy) across individual subjects. Setting: An inpatient rehabilitation facility. Subjects: Eight patients with ophthalmoplegia (a total of 15 affected muscles) due to brainstem injury. Interventions and measures: Basic rehabilitative treatment that included physical therapy, occupational therapy and/or speech therapy for impairments such as hemiplegia, ataxia or dysarthria was administered for two weeks (control treatment). Then, two facilitation exercise sessions (100 times/day, five days/week for two weeks) were administered in addition to the basic rehabilitative treatment for four weeks to the eight patients with ophthalmoplegia. Ophthalmoplegia was evaluated at study entry and at the end of each two-week session. The goal of the facilitation exercises is to facilitate voluntary eye movement using conjugated eye movements in the direction opposite to passive movements of the head. To assess ophthalmoplegia we measured the distance between the internal/external corneal margin and the canthus of the affected eye on images recorded on a video tape recorder. Results: After the initial two-week basic rehabilitative treatment, the distance between the corneal margin and canthus decreased slightly. Subsequently, after each of the two facilitation exercise sessions, there were significant reductions in the distance between the corneal margin and canthus compared with that at the beginning of the respective facilitation exercise session. Conclusion: Facilitation exercises significantly improved the horizontal movement of eyes with ophthalmoplegia due to brainstem injury.
Article
The development of neural connectivity in the visual cortex of cats grown in different visual environments is discussed. It is proposed that firing of presynaptic cells in the lateral geniculate nucleus followed by firing of postsynaptic cells in the visual cortex leads to a strengthening of excitatory connections between the presynaptic and postsynaptic cells. A mathematical representation of the dynamics describing this process is developed, and numerical studies are presented for the development of a model system in a number of different environments and initial connectivities. The results are compared with experimental studies of cats grown in altered visual environments.
Article
A nerve net model for the visual cortex of higher vertebrates is presented. A simple learning procedure is shown to be sufficient for the organization of some essential functional properties of single units. The rather special assumptions usually made in the literature regarding preorganization of the visual cortex are thereby avoided. The model consists of 338 neurones forming a sheet analogous to the cortex. The neurones are connected randomly to a retina of 19 cells. Nine different stimuli in the form of light bars were applied. The afferent connections were modified according to a mechanism of synaptic training. After twenty presentations of all the stimuli individual cortical neurones became sensitive to only one orientation. Neurones with the same or similar orientation sensitivity tended to appear in clusters, which are analogous to cortical columns. The system was shown to be insensitive to a background of disturbing input excitations during learning. After learning it was able to repair small defects introduced into the wiring and was relatively insensitive to stimuli not used during training.
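A hedged toy version of the synaptic-training mechanism this abstract describes: afferent weights of the most strongly responding "cortical" units are strengthened for the active retinal inputs and then renormalized, after which individual units become committed to particular stimuli. The tiny sizes, the winner-selection rule, and the random binary stimuli below are simplifications chosen for illustration, not the 338-neuron model itself.

```python
import numpy as np

rng = np.random.default_rng(3)
n_retina, n_cortex, n_stimuli = 19, 20, 9
stimuli = (rng.random((n_stimuli, n_retina)) < 0.3).astype(float)   # stand-ins for light-bar stimuli

W = rng.random((n_cortex, n_retina))
W /= W.sum(axis=1, keepdims=True)              # fixed total afferent strength per unit

eta = 0.1
for epoch in range(20):                        # "twenty presentations of all the stimuli"
    for s in stimuli:
        response = W @ s
        winners = response >= np.quantile(response, 0.8)    # the most excited units
        W[winners] += eta * s                  # strengthen their active afferents
        W /= W.sum(axis=1, keepdims=True)      # renormalize (synaptic training constraint)

preferred = np.argmax(W @ stimuli.T, axis=1)   # each unit's preferred stimulus after training
print("preferred stimulus per cortical unit:", preferred)
```

Units sharing a preferred stimulus play the role of the clusters (column-like groupings) mentioned in the abstract; the normalization step is what forces each unit to specialize rather than grow all weights indefinitely.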
Article
Neuronal activity in dorsal hippocampus was recorded in rabbits during classical conditioning of the nictitating membrane response, with tone as conditioned stimulus and corneal air puff as unconditioned stimulus. Unit activity in hippocampus rapidly forms a temporal neuronal "model" of the behavioral response early in training. This hippocampal response does not develop in control animals given unpaired stimuli.
Article
We show that given certain plausible assumptions the existence of persistent states in a neural network can occur only if a certain transfer matrix has degenerate maximum eigenvalues. The existence of such states of persistent order is directly analogous to the existence of long range order in an Ising spin system; while the transition to the state of persistent order is analogous to the transition to the ordered phase of the spin system. It is shown that the persistent state is also characterized by correlations between neurons throughout the brain. It is suggested that these persistent states are associated with short term memory while the eigenvectors of the transfer matrix are a representation of long term memory. A numerical example is given that illustrates certain of these features.
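A small numerical illustration of the eigenvalue criterion stated above, under the assumption of a Little-type transfer matrix built from the logistic firing rule quoted earlier: at low noise (large β) with uniform excitatory couplings, the second-largest eigenvalue approaches the leading eigenvalue of 1, the near-degeneracy associated with persistent states. The network size and couplings are arbitrary illustrative choices.

```python
import itertools
import numpy as np

def transfer_matrix(J, beta):
    """2^N x 2^N transition-probability (transfer) matrix of a Little-type network."""
    N = J.shape[0]
    states = [np.array(s) for s in itertools.product([-1, 1], repeat=N)]
    T = np.zeros((len(states), len(states)))
    for a, s in enumerate(states):
        F = J @ s
        for b, s_next in enumerate(states):
            T[a, b] = np.prod(1.0 / (1.0 + np.exp(-beta * s_next * F)))
    return T

N = 4
J = np.ones((N, N)) - np.eye(N)                # uniform excitatory couplings (illustrative)
for beta in (0.2, 2.0):                        # noisy vs. low-noise regime
    eig = np.sort(np.abs(np.linalg.eigvals(transfer_matrix(J, beta))))[::-1]
    print(f"beta={beta}: two largest eigenvalue magnitudes {eig[0]:.4f}, {eig[1]:.4f}")
```

At beta=2.0 the all-firing and all-quiet configurations are nearly self-perpetuating, so the gap between the two leading eigenvalues closes; at beta=0.2 the gap is large and no firing pattern persists.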
Article
One of the major problems with which neurophysiological research has been confronted, in recent years, is the way in which the brain deals with the information that it receives over the sensory pathways. The study of this problem has been much enhanced by the development of techniques for the recording of the activity of single neurons. The use of such techniques has provided valuable information in two fields of investigation. One field, the electrophysiology of the brain cell: in this respect it has been found that electrical characteristics of cortical and thalamic neurons are very similar to those of neurons located elsewhere in the central nervous system (4, 5, 11, 16, 22). Another field, the study of the patterns of response of single brain cells to peripheral stimulation: in this respect it has been found that the latency of response, the number of spikes and the intervals between spikes generated by each individual neuron are related to the intensity, the frequency and the position of the peripheral stimulus (2, 3, 6, 7-10, 15, 17, 18, 23). Thus, cerebral neurons participate in the coding and the displaying of information upon the maps which, in the sensory receiving areas of the brain, represent the spatial, temporal, and other characteristics of events that occur in the outside world. Little is known, however, about the way in which the coded information displayed upon these maps is selected and organized in order to result in the processes which we call perception and integration. Whatever the mechanism of these processes may be, it is evident that, since it involves comparison, selection, and organization of sensory and memory data, it cannot be based on the activity of isolated neurons but it must be based on the interrelated activities of the large number of neurons which, with their connections, form the complex networks present in the brain stem and in the cortex. It is well known that some neurons in these networks are spontaneously
Article
A model of a neural system where a group of neurons projects to another group of neurons is discussed. We assume that a trace is the simultaneous pattern of individual activities shown by a group of neurons. We assume synaptic interactions add linearly and that synaptic weights (quantitative measure of degree of coupling between two cells) can be coded in a simple but optimal way where changes in synaptic weight are proportional to the product of pre- and postsynaptic activity at a given time. Then it is shown that this simple system is capable of “memory” in the sense that it can (1) recognize a previously presented trace and (2) if two traces have been associated in the past (that is, if trace f̄ was impressed on the first group of neurons and trace ḡ was impressed on the second group of neurons and synaptic weights coupling the two groups changed according to the above rule) presentation of f̄ to the first group of neurons gives rise to ḡ plus a calculable amount of noise at the second set of neurons. This kind of memory is called an “interactive memory” since distinct stored traces interact in storage. It is shown that this model can effectively perform many functions. Quantitative expressions are derived for the average signal to noise ratio for recognition and one type of association. The selectivity of the system is discussed. References to physiological data are made where appropriate. A sketch of a model of mammalian cerebral cortex which generates an interactive memory is presented and briefly discussed. We identify a trace with the activity of groups of cortical pyramidal cells. Then it is argued that certain plausible assumptions about the properties of the synapses coupling groups of pyramidal cells lead to the generation of an interactive memory.
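A minimal sketch of the interactive memory under the usual linear-associator reading of this abstract: synaptic weights accumulate outer products of post- and presynaptic traces, and presenting a stored trace f̄ to the first group returns its associate ḡ plus a calculable amount of crosstalk noise from the other stored pairs. The dimensions and number of stored pairs are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(4)
n_in, n_out, n_pairs = 100, 80, 5

F = rng.normal(0.0, 1.0, (n_pairs, n_in))
F /= np.linalg.norm(F, axis=1, keepdims=True)  # normalized input traces f
G = rng.normal(0.0, 1.0, (n_pairs, n_out))     # associated output traces g

# weight change proportional to the product of pre- and postsynaptic activity;
# summed over the stored pairs this gives A = sum_k g_k f_k^T
A = np.zeros((n_out, n_in))
for f, g in zip(F, G):
    A += np.outer(g, f)

recalled = A @ F[0]                            # present f_0 to the first group
noise = recalled - G[0]                        # what remains beyond the stored associate
print("signal-to-noise (norm ratio):", np.linalg.norm(G[0]) / np.linalg.norm(noise))
```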
Article
Recently, Little demonstrated the existence of persistent states in a neural network when a certain transfer matrix has approximately degenerate maximum eigenvalues. He showed the direct analogy of the persistence of the neuronal firing patterns, considered in discrete time steps, to the long-range spatial order in an Ising spin crystal system. The ordered phase of the spin system occurs below the Curie point temperature. Hence, in analogy, a factor representing the temperature of the neural network is assumed in Little's model for the transfer matrix. We derive his transfer matrix for a neural network and explicitly relate this temperature or “smearing” phenomenon to the experimentally observed fluctuations in the postsynaptic potentials.
Article
Gross and neuronal evoked responses, obtained by stimulation with brief flashes of light, were recorded from the lateral geniculate body and the visual cortex of unanesthetized, unrestrained cats. It has been found that in the geniculate as well as in the cortex, the probability of neuronal discharge is highest at the times which correspond to the steepest negative slopes and lowest at the times which correspond to the steepest positive slopes of the gross response. These relations remain unchanged as the magnitude of the gross response is increased by stimuli of higher intensities, or its configuration modified by downward displacement of the recording microelectrode.
Article
We present a theory of short, intermediate and long term memory of a neural network incorporating the known statistical nature of chemical transmission at the synapses. Correlated pre- and post-synaptic facilitation (related to Hebb's Hypothesis) on three time scales are crucial to the model. Considerable facilitation is needed on a short time scale both for establishing short term memory (active persistent firing pattern for the order of a sec) and the recall of intermediate and long term memory (latent capability for a pattern to be re-excited). Longer lasting residual facilitation and plastic changes (of the same nature as the short term changes) provide the mechanism for imprinting of the intermediate and long term memory. We discuss several interesting features of our theory: nonlocal memory storage, large storage capacity, access of memory, single memory mechanism, robustness of the network and statistical reliability, and usefulness of statistical fluctuations.
Article
In contrast to well-studied through-projection neurons that propagate information from one region to another in the central nervous system, short-axon or axonless neurons form local circuits, transmitting signals through synapses and electrical junctions between their dendrites. Interaction in this dendritic network proceeds without spike action potentials. Interaction is mediated by graded electrotonic changes of potential and is transmitted through high-sensitivity (submillivolt threshold) synapses rather than by lower-sensitivity (20- to 100-mV threshold) synapses typical of projection neurons. A crucial feature of local circuits is their high degree of interaction both through specialized junctional structures and through the extracellular fields generated by local and more distant brain regions. The anatomical evidence for the nature and distribution of neuronal local circuits in the nervous system is surveyed. Bioelectric mechanisms are discussed in relation to the special properties of local circuits, including dendrodendritic synapses, synaptic sensitivity, electrotonic coupling, and field effects. Intraneuronal and interneuronal transport of various types of substances suggests that the biochemical and the bioelectrical parameters are functionally interwoven. Through such interactions neuronal local circuits, with their distinctive properties, may play an essential role in higher brain function.
Article
It is well known that the rhythmic activity of the cerebral cortex is closely associated with a highly organized circulation of neuronal impulses through the networks in which the activity develops. The present study has attempted to determine by quantitative methods, the consistency of this circulation and of its association with the cortical waves, over relatively long periods. Cortical gross waves and neuronal activity have been recorded by means of arrays of extracellular microelectrodes, in freely moving cats as well as in cats immobilized with flaxedil. Autocorrelations have been performed on trains of rhythmic waves and on associated clusters of action potentials; cross-correlations and multiple correlations have been performed between waves and action potentials and between action potentials generated by different groups of neurons. It has been found that the rhythmicity of the waves, the rhythmicity of the associated clusters of action potentials, the time and phase relations between them, and the circulation of neuronal activity through the networks, remain highly consistent over periods as long as two hours.
Article
The terminal distribution of cortico-cortical connections was examined by autoradiography 7–8 days following injections of tritium labeled amino acids into the dorsal bank of the principal sulcus, the posterior part of the medial orbital gyrus, or the hand and arm area of the primary motor cortex in monkeys ranging in age from 4 days to 5.5 months. Labeled axons originating in these various regions of the frontal lobe have topographically diverse ipsilateral and contralateral destinations but virtually all of these projections share a common mode of distribution: they terminate in distinct vertically oriented columns, 200–500 μm wide, that extend across all layers of cortex and alternate in regular sequence with columns of comparable width in which grains do not exceed background. Spatial periodicity in the pattern of transported label in such regions as the prefrontal association cortex, the retro-splenial limbic cortex and the motor cortex indicates that columniation in the intra-cortical distribution of afferent fibers is not unique to sensory specific cortex but is instead a general feature of neocortical organization. A columnar mode of distribution of cortico-cortical projections is present in monkeys at all ages investigated but is especially well delineated in the youngest of them. Thus, grain concentrations within columns are very high in monkeys injected at 4 days of age, somewhat lower in monkeys injected at 39–45 days of age, and least dense in those injected at 5.5 months. The distinctness of the spatially segregated pattern of innervation in the cortex of neonates indicates that the columnar organization of association-fiber systems in the frontal and limbic cortex is achieved before or shortly after birth.
Article
Adult cats were implanted with a movable microelectrode and were trained to perform for food reward in response to diffuse light flicker at two different frequencies. After substantial overtraining, the patterns of cell response (poststimulus histogram) were obtained during generalization trials, using an intermediate frequency stimulus. An average of 29% of the cells examined in lateral geniculate nucleus and visual cortex traverses showed statistically significant differences in the late component of the neuronal response when different responses to the same generalization stimulus were compared.
Article
The ratio of standard deviation to mean, of the amplitude spectrum of well resolved single units, is shown to scale exponentially. This scaling is explained on the basis of a consideration of the relationship of extracellular voltage to voltage gradient, in a dendritic neuron model. A method is then advanced to extract, from multi-unit data, normally distributed amplitude spectra which are indistinguishable from those of well resolved single units. This method is suggested as a means of characterizing, in general, the unit activity obtained from chronically implanted preparations.
Article
Passive modification of the strength of synaptic junctions that results in the construction of internal mappings with some of the properties of memory is shown to lead to the development of Hubel-Wiesel type feature detectors in visual cortex. With such synaptic modification a cortical cell can become committed to an arbitrary but repeated external pattern, and thus fire every time the pattern is presented even if that cell has no genetic pre-disposition to respond to the particular pattern. The additional assumption of lateral inhibition between cortical cells severely limits the number of cells which respond to one pattern as well as the number of patterns that are picked up by a cell. The introduction of a simple neural mapping from the visual field to the lateral geniculate leads to an interaction between patterns which, combined with our assumptions above, seems to lead to a progression of patterns from column to column of the type observed by Hubel and Wiesel in monkey.
Article
A movable microelectrode was implanted in adult cats trained to respond differentially to two different frequencies of light flicker. Unit responses were recorded along cortical and thalamic trajectories. The late components of the poststimulus response of 29% of the cells examined showed statistically significant differences when data from different behavioral outcomes to the same neutral generalization stimulus were compared.
Article
The statistical analysis of two simultaneously observed trains of neuronal spikes is described, using as a conceptual framework the theory of stochastic point processes. The first statistical question that arises is whether the observed trains are independent; statistical techniques for testing independence are developed around the notion that, under the null hypothesis, the times of spike occurrence in one train represent random instants in time with respect to the other. If the null hypothesis is rejected, that is, if dependence is attributed to the trains, the problem then becomes that of characterizing the nature and source of the observed dependencies. Statistical signs of various classes of dependencies, including direct interaction and shared input, are discussed and illustrated through computer simulations of interacting neurons. The effects of nonstationarities on the statistical measures for simultaneous spike trains are also discussed. For two-train comparisons of irregularly discharging nerve cells, moderate nonstationarities are shown to have little effect on the detection of interactions. Combining repetitive stimulation and simultaneous recording of spike trains from two (or more) neurons yields additional clues as to possible modes of interaction among the monitored neurons; the theory presented is illustrated by an application to experimentally obtained data from auditory neurons. A companion paper covers the analysis of single spike trains.
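A hedged sketch of the two-train comparison described above: spike times of one train serve as reference instants, spike-time differences to the other train are histogrammed, and a flat cross-correlogram is what independence predicts, while shared input produces a peak near zero lag. The simulated trains, the shared-input fraction, and the lag window are illustrative choices, not the paper's procedures or data.

```python
import numpy as np

rng = np.random.default_rng(5)
T_rec = 100.0                                  # seconds of simulated recording
rate = 5.0                                     # background rate, spikes per second

# two trains that share a common input: shared spike times appear (jittered) in both
common = np.sort(rng.uniform(0.0, T_rec, int(2.0 * T_rec)))
train_a = np.sort(np.concatenate([rng.uniform(0.0, T_rec, int(rate * T_rec)), common]))
train_b = np.sort(np.concatenate([rng.uniform(0.0, T_rec, int(rate * T_rec)),
                                  common + rng.normal(0.0, 0.002, common.size)]))

def cross_correlogram(ref, other, max_lag=0.05, bin_width=0.005):
    """Histogram of spike-time differences (other - ref) within +/- max_lag seconds."""
    lags = []
    for t in ref:
        near = other[(other >= t - max_lag) & (other <= t + max_lag)]
        lags.extend(near - t)
    bins = np.arange(-max_lag, max_lag + bin_width, bin_width)
    return np.histogram(lags, bins=bins)

counts, edges = cross_correlogram(train_a, train_b)
for c, left in zip(counts, edges[:-1]):
    print(f"lag {left * 1000:+6.1f} ms: {c}")  # independent trains would give a flat profile
```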
Article
Coupled nonlinear differential equations are derived for the dynamics of spatially localized populations containing both excitatory and inhibitory model neurons. Phase plane methods and numerical solutions are then used to investigate population responses to various types of stimuli. The results obtained show simple and multiple hysteresis phenomena and limit cycle activity. The latter is particularly interesting since the frequency of the limit cycle oscillation is found to be a monotonic function of stimulus intensity. Finally, it is proved that the existence of limit cycle dynamics in response to one class of stimuli implies the existence of multiple stable states and hysteresis in response to a different class of stimuli. The relation between these findings and a number of experiments is discussed.
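A hedged numerical sketch of the kind of coupled excitatory/inhibitory population dynamics this abstract analyses, using a standard Wilson-Cowan-style pair of equations and a plain Euler integrator. The coefficients are illustrative values drawn from the commonly quoted oscillatory regime of such models, not parameters taken from the paper, and the qualitative behaviour (steady state versus oscillation) depends on these choices.

```python
import numpy as np

def S(x, a, theta):
    """Saturating population response function of net input."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def simulate(P, steps=6000, dt=0.005):
    """Euler integration of coupled excitatory (E) and inhibitory (I) populations."""
    c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0     # illustrative coupling coefficients
    E, I = 0.1, 0.05
    trace = []
    for _ in range(steps):
        dE = -E + (1 - E) * S(c1 * E - c2 * I + P, a=1.3, theta=4.0)
        dI = -I + (1 - I) * S(c3 * E - c4 * I, a=2.0, theta=3.7)
        E, I = E + dt * dE, I + dt * dI
        trace.append(E)
    return np.array(trace)

for P in (0.0, 1.25):                          # two stimulus intensities
    tail = simulate(P)[3000:]
    print(f"P={P}: late-time E range [{tail.min():.3f}, {tail.max():.3f}]")
```

A broad late-time range of E signals sustained oscillation (limit-cycle-like activity), while a collapsed range signals relaxation to a fixed point; scanning P in this way mirrors the abstract's point that the response regime depends on stimulus intensity.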
Nerve net models are designed that can learn up to γ of the α^n possible sequences of stimuli, where γ is large but α^n much larger still. The models proposed store information in modifiable synapses. Their connexions need to be specified only in a general way, a large part being random. They resist destruction of a good many cells. When built with Hebb synapses (or any other class B or C synapses whose modification depends on the conjunction of activities in two cells) they demand a number of inputs to each cell that agrees well with known anatomy. The number of cells required, for performing tasks of the kind considered as well as the human brain can perform them, is only a small fraction of the number of cells in the brain. It is suggested that the models proposed are likely to be the most economical possible for their tasks, components and constructional constraints, and that any others that approach them in economy must share with them certain observable features, in particular an abundance of cells with many independent inputs and low thresholds.
It is proposed that the learning of many tasks by the cerebrum is based on using a very few fundamental techniques for organizing information. It is argued that this is made possible by the prevalence in the world of a particular kind of redundancy, which is characterized by a 'Fundamental Hypothesis'. This hypothesis is used to found a theory of the basic operations which, it is proposed, are carried out by the cerebral neocortex. They involve the use of past experience to form so-called 'classificatory units' with which to interpret subsequent experience. Such classificatory units are imagined to be created whenever either something occurs frequently in the brain's experience, or enough redundancy appears in the form of clusters of slightly differing inputs. A (non-Bayesian) information theoretic account is given of the diagnosis of an input as an instance of an existing classificatory unit, and of the interpretation as such of an incompletely specified input. Neural models are devised to implement the two operations of diagnosis and interpretation, and it is found that the performance of the second is an automatic consequence of the model's ability to perform the first. The discovery and formation of new classificatory units is discussed within the context of these neural models. It is shown how a climbing fibre input (of the kind described by Cajal) to the correct cell can cause that cell to perform a mountain-climbing operation in an underlying probability space, that will lead it to respond to a class of events for which it is appropriate to code. This is called the 'spatial recognizer effect'. The structure of the cerebral neocortex is reviewed in the light of the model which the theory establishes. It is found that many elements in the cortex have a natural identification with elements in the model. This enables many predictions, with specified degrees of firmness, to be made concerning the connexions and synapses of the following cortical cells and fibres: Martinotti cells; cerebral granule cells; pyramidal cells of layers III, V and II; short axon cells of all layers, especially I, IV and VI; cerebral climbing fibres and those cells of the cortex which give rise to them; cerebral basket cells; fusiform cells of layers VI and VII. It is shown that if rather little information about the classificatory units to be formed has been coded genetically, it may be necessary to use a technique called codon formation to organize structure in a suitable way to represent a new unit. It is shown that under certain conditions, it is necessary to carry out a part of this organization during sleep. A prediction is made about the effect of sleep on learning of a certain kind.
Article
Coherent patterns of neural activity reflect the release of memories and may mediate subjective experience.
Article
Single and multiple units, both cells of origin of pyramidal tract (PT) and unidentified units, were studied in relation to the local gross evoked potential to peripheral stimulation in pericruciate association cortex of the chloralosed cat. Both PT and non-PT cells were polymodal, with the PT cells firing later and for a longer duration. PT cells generate a near mirror image of the initial component of the evoked association potential. Spike height analysis on multiple unit data showed that small and large units had the highest probability of firing on the maximum negative slope and maximum negativity, respectively, of the evoked potential.
Article
Anatomical and physiological evidence is cited for the existence in the CNS of more or less discrete populations of interconnected neurons. These are given the term netlets. A model based on these observations is presented, in which it is assumed that the netlets are the fundamental building blocks out of which nets of considerable complexity may be assembled. The connectivity within each netlet is assumed to be random. Neuronal macrostates are defined in which the fractions of neurons active in each netlet are the dynamical variables. Thus the temporal and spatial fine structure of neuronal activity are considered to be of secondary significance and are disregarded. These assumptions bring about an enormous reduction in complexity. Thus calculations and computer simulation studies become possible for systems hitherto inaccessible to quantitative description. It is hoped that the features retained in the model play a sufficiently significant role in the functioning of real neural nets to make these results meaningful. The mathematical formalism and detailed numerical results appear in another paper of this issue (Anninos, 1970). Some of these results are anticipated in this paper and their implications for our model are discussed. The study proceeds from a treatment of isolated probabilistic netlets to the dynamics of interacting netlets. Of particular interest are the conditions under which a netlet will go into sustained activity and the often extremely delicate control exerted by afferent excitatory or inhibitory biases. Hysteresis effects are common and may represent a type of short-term memory. A variety of neural functions are listed to which some of these mechanisms may be applied. Among these are the modulating effects of the brain stem reticular formation on cortical and spinal neuron populations and the “energizing” of cortical centers by spontaneous activity in sensory systems. Finally the concepts of netlet interaction are applied in conjunction with the principle of synaptic facilitation to information processing in the cortex. Examples given are sensory-sensory cortical conditioning and the formation of the classical conditioned reflex.
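A minimal sketch of the netlet macrostate idea, under simplifying assumptions not in the original formalism (purely excitatory connections, no refractoriness, and no afferent bias): the single dynamical variable is the active fraction a, and its next value is the probability that a neuron receives at least a threshold number of inputs from the currently active fraction, with connectivity treated as random. The illustrative parameters below show the ignition threshold separating dying activity from sustained activity.

```python
from math import comb

def next_activity(a, k=20, threshold=5):
    """Fraction active at t+1: probability that at least `threshold` of a neuron's
    k randomly chosen afferents were active (fraction a) at time t."""
    return sum(comb(k, m) * a**m * (1 - a)**(k - m) for m in range(threshold, k + 1))

for a0 in (0.05, 0.15, 0.40):                  # different initial activity levels
    a = a0
    history = [a]
    for _ in range(15):
        a = next_activity(a)
        history.append(a)
    print(f"a0={a0}: " + " ".join(f"{x:.2f}" for x in history[:8]))
```

Below the threshold the active fraction decays toward zero; above it the netlet ignites into sustained high activity, the kind of switch-like behaviour the abstract attributes to afferent biases and hysteresis.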
Article
A study has been made of the behaviour of a set of variable pathways from different inputs which converge on a single output; input and output signals are all-or-none. For two particular laws the system has been treated analytically and by computer simulation under various input conditions. The two laws are, (1) that the conductivity of a pathway is proportional to the negative of the Shannon information function between the signals at the input and the output of the pathway and, (2) that the total excitation arriving at the output determines the probability of an output signal. The change in frequency of the binary signals in a pathway is a measure of the change in probability of the quantity associated with that pathway; this f.m. code is the same for inputs and outputs. It has been found that such a system adapts to become a discriminator of the most frequent pattern occurring in the set of input signals during adaptation. Inputs which are statistically independent of the pattern become disconnected from the output. Thereafter, if any new pattern occurs, the change in frequency of the output signals is a measure of the similarity of the new pattern to the learned pattern. The system has been called an informon because its output is a measure of the total information conveyed by all the inputs about the pattern to which the system has adapted; it can form a building brick for a multilayer pattern discrimination network with the ability to classify, associate and predict.
Article
Some possible neural mechanisms of pattern discrimination are discussed, leading to neural networks which can discriminate any number of essentially arbitrarily complicated space-time patterns and activate cells which can then learn and perform any number of essentially arbitrarily complicated space-time patterns in response to the proper input pattern. Among the topics that arise in this discussion are: use of non-recurrent inhibitory interneurons for temporal or spatial discrimination tasks which recurrent inhibitory interneurons cannot carry out; mechanisms of temporal generalization whereby the same cells control performance of a given act at variable speeds; a tendency for cells furthest from the sensory periphery to have the most specific response modes and the least ability to follow sensory intensities (e.g. on-off and bimodal responses are common); uses of non-recurrent on-off fields whose signals arrive in waves forming “interference patterns”, with the net effect of rapidly choosing at most one behavioral mode from any number of competitive modes, or of non-specifically arousing or suppressing cells which can sample and learn ongoing internal patterns; uses of specific vs. non-specific inhibitory interneurons, axon hillock inhibition, presynaptic inhibition, equal smoothing of excitatory and inhibitory signals, possible production of both excitatory and inhibitory transmitter in a single synaptic knob, blockade of postsynaptic potential response, logarithmic transduction of inputs to spiking frequencies, or saturation of cell body response in non-recurrent on-off fields for purposes of pattern discrimination.
Article
These observations indicate that the long-lasting trace of an experience is not completely fixed, consolidated, or coded at the time of the experience. Consolidation requires time, and under at least some circumstances the processes of consolidation appear to be susceptible to a variety of influences, both facilitating and impairing, several hours after the experience. There must be, it seems, more than one kind of memory trace process (31). If permanent memory traces consolidate slowly over time, then other processes must provide a temporary basis for memory while consolidation is occurring. The evidence clearly indicates that trial-to-trial improvement, or learning, in animals cannot be based completely on permanent memory storage. Amnesia can be produced by electroshock and drugs even if the animals are given the treatment long after they have demonstrated "learning" of the task. Of particular interest is the finding that retention of the inhibitory avoidance response increases with time. In a sense this should be expected, for it has long been known (and ignored) that, within limits, learning is facilitated by increasing the interval between repeated trials (7, 30). Our result may be the simplest case of such an effect. Since the improvement in retention with time seemed not to be due solely to consolidation (as indicated by electroshock effects), it would seem that the "distribution of practice" effect, as it is typically designated, may be due in part to a time-dependent temporary memory storage process. In our work with animals we have found no analog of human immediate memory such as that required for repeating digits (or finishing sentences). Animals tested on the task described above immediately after a trial typically showed no evidence of memory. It could be that the poor performance is due to excessive fright, but the "distribution of practice effect" is also typically observed in learning experiments in which food reward is used rather than shock avoidance. Since the retention tasks require the animals to change their behavior in some way, it could well be that the growth of retention over the first few minutes after a trial is due to time-dependent processes involved in the organization of processes necessary for changing behavior, in addition to those involved in temporary storage and retrieval. It is worth pointing out that there is evidence of an analogous process in human memory (32). A complex picture of memory storage is emerging. There may be three memory trace systems: one for immediate memory (and not studied in our laboratory); one for short-term memory which develops within a few seconds or minutes and lasts for several hours; and one which consolidates slowly and is relatively permanent. The nature of the durability of the long-term memory trace (that is, the nature and basis of forgetting) is a separate but important issue. There is increasing evidence and speculation (20, 21, 33) that memory storage requires a "tritrace" system, and our findings are at least consistent with such a view. If there are, as seems possible, at least three kinds of traces involved in memory storage, how are they related? Is permanent memory produced by activity of temporary traces (31), or are the trace systems relatively independent? Although available findings do not provide an answer to this question, there does seem to be increasing evidence that the systems are independent.
Acquisition can occur, as we have seen, without permanent consolidation, and both short-term and long-term memory increase with time. All this evidence suggests (but obviously does not prove) that each experience triggers activity in each memory system. Each repeated training trial may, according to this view, potentiate short-term processes underlying acquisition while simultaneously enhancing independent underlying long-term consolidation. Obviously, acceptance of these conclusions will require additional research. If this view is substantially correct, it seems clear that any search for the engram or the basis of memory is not going to be successful. Recognition of the possibility that several independent processes may be involved at different stages of memory may help to organize the search. A careful examination of the time course of retention and memory trace consolidation, as well as examination of the bases of the effects of memory-impairing and memory-facilitating treatments, may help to guide the search. It is clear that a complete theory of memory storage must eventually provide an understanding of time-dependent processes in memory. In 1930 Lashley wrote (2), "The facts of both psychology and neurology show a degree of plasticity, of organization, and of adaptation and behavior which is far beyond any present possibility of explanation." Although this conclusion is still valid, the current surge of interest in memory storage offers hope that this conclusion may soon need to be modified.
Anderson, J. A. A simple neural network generating an interactive memory. Math. Biosci. 14: 197-220, 1972.
Berger, T. W., B. Alger and R. F. Thompson. Neuronal substrate of classical conditioning in the hippocampus. Science 192: 483-485, 1976.
Brindley, G. S. Nerve net models of plausible size that perform many simple learning tasks. Proc. R. Soc. 174: 173-191, 1969.
Burns, B. D. The Uncertain Nervous System. London: Edward Arnold Publishers, 1968.
Decima, E. E. Plastic synaptic changes induced by orthodromic-antidromic pairing. In: Mechanism of Synaptic Action, Abstract, p. 14. Jerusalem Symposium, XXVI International Congress of Physiological Sciences, 1974.
Katz, B. Nerve, Muscle and Synapse. New York: McGraw-Hill, 1966.
Little, W. A. The existence of persistent states in the brain.
Mescherskii, R. M. The vectorgraphical characteristics of spontaneous rabbit brain cortex activity. Sechenov Physiol. J. U.S.S.R. (English transl.) 47: 419-426, 1961.
McGaugh, J. L. Time dependent processes in memory storage. Science 153: 1351-1358, 1966.
... feature detecting cells in visual cortex. Biol. Cybernetics 19: 1-18, 1975.
Rinaldi, P., G. Juhaz and M. Verzeano. Analysis of neuronal activity in the waking cortex. Brain Res. Bull. 1: 429-435, 1976.
Rosenblatt, F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, DC: Spartan Books, 1961.
Schmitt, F. O., P. Dev and B. H. Smith. Electrotonic processing of information by brain cells. Science 193: 114-120, 1976.
Schwartz, E. L., A. Ramos and E. R. John. Single cell activity in chronic unit recording: a quantitative study of the unit amplitude spectrum. Brain Res. Bull. 1: 57-68, 1976.
Shaw, G. L. and R. Vasudevan. Persistent states of neural networks and the random nature of synaptic transmission. Math. Biosci. 21: 207-218, 1974.