Publications (50) · 33.02 Total Impact

Article: Systems Biology and Addiction
ABSTRACT: The onset of addiction is marked by drug-induced positive experiences that are repeated. During that time, adaptation occurs and addiction is stabilized. Interruption of these processes induces polysymptomatic withdrawal syndromes, and abstinence is accompanied by the risk of relapse. These features of addiction suggest adaptive brain dynamics with common pathways in complex neuronal networks. Addiction research has used animal models, in which some of these phenomena can be reproduced, to find correlates of addictive behavior. The major thrust of those approaches has been the involvement of genes and proteins. Recently, an enormous amount of data has been obtained by high-throughput technologies in these fields. Therefore, (computational) "Systems Biology" had to be implemented as a new approach in molecular biology and biochemistry. Conceptually, Systems Biology can be understood as a field of theoretical biology that tries to identify patterns in complex data sets and that reconstructs the cell and cellular networks as complex, dynamic, self-organizing systems. This approach is embedded in systems science as an interdisciplinary effort to understand complex dynamical systems and belongs to the field of theoretical neuroscience (computational neuroscience). Systems Biology, like computational neuroscience, is based on applied mathematics, computer-based computation and experimental simulation. In terms of addiction research, building up a "computational molecular systems biology of the (addicted) neuron" could provide a better molecular-biological understanding of addiction at the cellular and network level. Some key issues are addressed in this article.
ABSTRACT: An important issue in the neurosciences is a quantitative description of the relation between sensory stimuli presented to an animal and their representations in the nervous system. A standard technique is the construction of a neural tuning curve, that is, a neuron's average firing rate as a function of some parameter characterizing a family of stimuli. It is unavoidable that some of the response data are erroneously attributed to a cell, e.g., during spike sorting. However, the widely used method of statistical analysis based on the sample mean and least-squares approximation for the spike count can perform extremely badly if the noise distribution is not exactly normal, which is almost never the case in applications. Here, we present a method for constructing neural tuning curves that is especially suited for cases of high noise and the presence of outliers. Since it is usually not decidable if an outlier is faulty or not, we limit the influence of far outlying points rather than try to identify and discard them. In contrast to traditional methods employing a point-by-point estimation of a tuning curve, we use all measured data from all different stimulus conditions at once in the construction. Given the measured data at only a finite number of stimulus conditions, a robust tuning curve is obtained that approximates the cell's ideal tuning curve optimally in all stimulus conditions with respect to a given distance measure. A measure that assesses the quality of this fitting method with respect to the traditional least-squares fitting method and to a median-based fitting method is introduced. The reliability of inference with respect to the encoding accuracy that can be achieved by a population of neurons is demonstrated in both artificially generated and experimentally recorded data from rat primary visual cortex.
While the data shown in this paper are responses to orientation stimuli, the method of tuning curve construction is also viable and maintains its optimality properties for the case in which the stimulus is defined on a finite interval. 
Article: Tuning Properties of Noisy Cells with Application to Orientation Selectivity in Rat Visual Cortex.
ABSTRACT: Common measures for the tuning of cells that are used in the neuroscience literature break down even in the case of moderately noisy neurons. For this reason, a considerable proportion of recorded neuronal data remains unconsidered. One reason for the unreliability of tuning measures is that least-squares fitting of a function for the tuning curve is likely to give too much influence to outliers. We present an algorithm using a rank-weighted norm to construct a tuning curve which weighs outlying data less strongly. As a model function for the tuning curve, we take a trigonometric polynomial, whose coefficients can be determined using a linear approximation. This approach avoids the occurrence of multiple local minima in the optimization process. A test criterion is given to answer the question whether a trigonometric polynomial of lower degree can account for the data. Throughout, we apply our findings to our own experimental data recorded from a population of neurons from area 17 of the rat.
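The construction described above can be sketched as an iteratively reweighted least-squares fit of a trigonometric polynomial, where weights are derived from the ranks of the absolute residuals so that far outlying points have limited influence. This is an illustrative stand-in for the paper's rank-weighted norm, not its exact procedure; all names and parameters are placeholders.

```python
import numpy as np

def fit_tuning_curve(theta, rates, degree=2, n_iter=10):
    """Fit r(theta) = a0 + sum_k [a_k cos(k theta) + b_k sin(k theta)] by
    iteratively reweighted least squares with rank-based weights
    (illustrative substitute for a rank-weighted norm)."""
    # Design matrix: constant term plus cos/sin harmonics up to `degree`
    cols = [np.ones_like(theta)]
    for k in range(1, degree + 1):
        cols.append(np.cos(k * theta))
        cols.append(np.sin(k * theta))
    X = np.column_stack(cols)
    w = np.ones(len(rates))                 # start from ordinary least squares
    for _ in range(n_iter):
        W = np.sqrt(w)[:, None]
        coef, *_ = np.linalg.lstsq(W * X, np.sqrt(w) * rates, rcond=None)
        resid = np.abs(rates - X @ coef)
        # Rank-based weights: the largest residuals get the smallest weights,
        # limiting the influence of outliers instead of discarding them.
        ranks = np.argsort(np.argsort(resid))   # 0 = smallest residual
        w = 1.0 - ranks / len(resid)
    return coef

# Noisy orientation-tuning data with one far outlier (e.g. a sorting artifact)
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
true = 10 + 5 * np.cos(2 * theta)           # orientation-tuned: period pi
rates = true + rng.normal(0, 0.5, theta.size)
rates[7] += 50                              # a single gross outlier
coef = fit_tuning_curve(theta, rates)
```

Because the coefficients enter linearly, each reweighted step is a single linear solve, which mirrors the abstract's point that no multi-minimum nonlinear optimization is needed.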
Article: Complex Dynamics is Abolished in Delayed Recurrent Systems with Distributed Feedback Times.
ABSTRACT: Feedback systems with a single delay time—as described by delay-differential equations—are known to exhibit various dynamical behaviors including complex oscillations and chaos. Here we show that the consideration of a broad distribution of delay times instead of a single delay results in a shift of the dynamical bifurcations toward higher parameter values, yielding a larger set of parameters with fixed point behavior or simple oscillatory behavior. We demonstrate similar phenomena in three different systems: neuronal feedback in the hippocampus, white blood cell production, i.e., the Mackey-Glass equation, and population dynamics in theoretical ecology. Our results suggest that the observed simplification of the dynamics is independent of the shape of the delay distribution and the precise nature of the feedback. The existence of distributed delay times may yield a mechanism to avoid irregular fluctuations in biological feedback systems.
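The effect can be illustrated with a minimal Euler integration of the Mackey-Glass equation, comparing a single delay against a broad set of delay times with the same mean. This is a sketch with assumed parameter values and a simple averaged-delay scheme, not the paper's simulations.

```python
import numpy as np

def mackey_glass(delays, beta=0.2, gamma=0.1, n=10, dt=0.1, t_max=600.0):
    """Euler integration of dx/dt = beta*x_d/(1+x_d**n) - gamma*x, where the
    delayed term x_d is averaged over a set of delay times. A one-element
    set recovers the classic single-delay Mackey-Glass equation."""
    steps = int(t_max / dt)
    lags = [int(round(tau / dt)) for tau in delays]
    hist = max(lags)
    x = np.empty(steps + hist)
    x[:hist] = 0.9                                # constant initial history
    for i in range(hist, steps + hist):
        x_d = np.mean([x[i - L] for L in lags])   # distributed delayed feedback
        x[i] = x[i - 1] + dt * (beta * x_d / (1 + x_d**n) - gamma * x[i - 1])
    return x[hist:]

single = mackey_glass([17.0])                     # single delay: irregular regime
spread = mackey_glass(np.linspace(1.0, 33.0, 17)) # same mean delay, broad spread
# The broad distribution damps the irregular oscillations toward the
# fixed point x* = (beta/gamma - 1)**(1/n) = 1, so the late-time
# fluctuations of `spread` are much smaller than those of `single`.
```

The comparison keeps the mean delay fixed at 17, so the simplification comes from the spread of delays alone, which is the mechanism the abstract describes.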
ABSTRACT: Interaction delays are ubiquitous in feedback systems due to finite signal conduction times. An example is the hippocampal feedback loop comprising excitatory pyramidal cells and inhibitory basket cells, where delays are introduced through synaptic, dendritic and axonal signal propagation. It is well known that in delayed recurrent systems complex periodic orbits and even chaos may occur. Here we study the case of distributed delays arising from diversity in transmission speed. Through stability considerations and numerical computations we show that feedback with distributed delays yields simpler behavior than the single-delay case: oscillations may have a lower period or even be replaced by steady-state behavior. The introduction of diversity in delay times may thus be a strategy to avoid complex and irregular behavior in systems where delayed regulation is unavoidable.
Article: Recurrent Inhibitory Dynamics: The Role of State-Dependent Distributions of Conduction Delay Times
ABSTRACT: We have formulated and analysed a dynamic model for recurrent inhibition that takes into account the state dependence of the delayed feedback signal (due to the variation in threshold of fibres with their size) and the distribution of these delays (due to the distribution of fibre diameters in the feedback pathway). Using a combination of analytic and numerical tools, we have analysed the behaviour of this model. Depending on the parameter values chosen, as well as the initial preparation of the system, there may be a spectrum of postsynaptic firing dynamics ranging from stable constant values through periodic bursting (limit cycle) behaviour and chaotic firing, as well as bistable behaviours. Using detailed parameter estimation for a physiologically motivated example (the CA3 basket cell-mossy fibre system in the hippocampus), we present some of these numerical behaviours. The numerical results corroborate the analytic characterization of the solutions: for some parameter values the model has a single stable steady state, while for others there is a bistability in which the eventual behaviour depends on the magnitude of stimulation (the initial function).
ABSTRACT: The way physics and other parts of science work can be explained in the framework of radical constructivism. However, this constructivist view itself shows that a uniquely accepted epistemology, constructivism or any other, would not be an advantage for the development of science. Unlike physics, some parts of science successfully use constructivist concepts inside their theories. Because this is the case particularly in learning theory, constructivist ideas can help to improve physics teaching.
ABSTRACT: Using extracellular recordings and computational modeling, we study the responses of a population of turtle (Pseudemys scripta elegans) retinal ganglion cells to different motion patterns. The onset of motion of a bright bar is signaled by a rise of the population activity that occurs within less than 100 ms. Correspondingly, more complex stimulus movement patterns are reflected by rapid variations of the firing rate of the retinal ganglion cell population. This behavior is reproduced by a computational model that generates ganglion cell activity from the spatiotemporal stimulus pattern using a Wiener model complemented by a nonlinear contrast gain control feedback loop responsible for the sharp transients in response to motion onset. This study demonstrates that contrast gain control strongly influences the temporal course of retinal population activity, and thereby plays a major role in the formation of a population code for stimulus movement patterns. 
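The role of contrast gain control in sharpening onset transients can be illustrated with a minimal cascade: a linear temporal filter, a rectifying nonlinearity, and a slower divisive feedback signal. This is a generic sketch under assumed time constants and gain, not the paper's fitted Wiener model.

```python
import numpy as np

def ln_gain_control(stimulus, dt=0.001, tau_f=0.02, tau_g=0.1, k=10.0):
    """Toy LN cascade with divisive gain-control feedback: the stimulus is
    low-pass filtered (tau_f), rectified, and divided by a slower feedback
    signal g (tau_g) that tracks the recent output. The fast filter responds
    before the feedback catches up, producing a sharp transient at stimulus
    onset followed by adaptation to a lower steady level."""
    y, g, rate = 0.0, 0.0, []
    for s in stimulus:
        y += dt / tau_f * (s - y)            # linear temporal filter
        r = max(y, 0.0) / (1.0 + k * g)      # rectification + divisive control
        g += dt / tau_g * (r - g)            # slow feedback integrates output
        rate.append(r)
    return np.array(rate)

stim = np.zeros(1000)
stim[200:] = 1.0                             # contrast step ~ motion onset
r = ln_gain_control(stim)
peak = int(r.argmax())                       # transient peak shortly after onset
```

The transient-then-adaptation profile is the qualitative signature the abstract attributes to the contrast gain control loop; all parameter values here are assumptions chosen only to make the effect visible.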
ABSTRACT: The encoding accuracy of a population of stochastically spiking neurons is studied for different distributions of their tuning widths. The situation of identical radially symmetric receptive fields for all neurons, which is usually considered in the literature, turns out to be disadvantageous from an information-theoretic point of view. Both a variability of tuning widths and a fragmentation of the neural population into specialized subpopulations improve the encoding accuracy.
Conference Paper: Postontogenetic short-term plasticity in the somatosensory system: a neural network model
ABSTRACT: Both repetitive tactile stimulation applied to the hindpaw of an adult rat and intracortical microstimulation in the primary somatosensory cortex lead to fully reversible plastic changes in the cortical hindpaw representation, which occur within hours. Psychophysical experiments in humans indicate that the plastic changes are accompanied by an increase in tactile resolution. We introduce a biologically oriented neural network to simulate the different experiments. The model uses a local learning rule which accounts for all major effects observed, such as the dynamical changes of receptive field size, the increase of the cortical representation area and the full reversibility of all plastic changes. We also discuss the conditions under which an increase in receptive field size of somatosensory neurons results in a higher sensory resolution.
Conference Paper: Critical and Non-Critical Avalanche Behavior in Networks of Integrate-and-Fire Neurons
ABSTRACT: We study avalanches of spike activity in fully connected networks of integrate-and-fire neurons which receive purely random input. In contrast to the self-organized critical avalanche behavior in sandpile models, critical and non-critical behavior is found depending on the interaction strength. Avalanche behavior can be readily understood by using combinatorial arguments in phase space.
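A toy version of such an avalanche measurement can be written down directly; the parameterization below (threshold units, reset to zero, one-shot firing per cascade) is an assumption for illustration, not the paper's model specification.

```python
import numpy as np

def avalanche_sizes(coupling, n_neurons=200, n_inputs=5000, seed=1):
    """Fully connected integrate-and-fire toy model: each random external
    input may push a neuron over threshold; every spike excites all other
    neurons, possibly triggering a cascade. Returns the cascade sizes.
    `coupling` is the total excitation a spike delivers, in threshold units,
    and plays the role of the interaction strength."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(0, 1, n_neurons)        # membrane potentials, threshold 1
    w = coupling / n_neurons                # pairwise interaction strength
    sizes = []
    for _ in range(n_inputs):
        u[rng.integers(n_neurons)] += rng.uniform(0, 1)  # random external input
        fired = np.zeros(n_neurons, dtype=bool)
        while True:
            new = (u >= 1.0) & ~fired
            if not new.any():
                break
            fired |= new
            u[new] = 0.0                    # reset; refractory for this cascade
            u[~fired] += w * new.sum()      # each spike excites all other neurons
        sizes.append(int(fired.sum()))
    return sizes

weak = avalanche_sizes(0.3)     # subcritical: mostly tiny avalanches
strong = avalanche_sizes(0.9)   # near-critical: much broader size distribution
```

The mean cascade size grows sharply as the coupling approaches the critical value, which is the interaction-strength dependence the abstract refers to.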
Conference Paper: Neural Representation of Multi-Dimensional Stimuli.
ABSTRACT: The encoding accuracy of a population of stochastically spiking neurons is studied for different distributions of their tuning widths. The situation of identical radially symmetric receptive fields for all neurons, which is usually considered in the literature, turns out to be disadvantageous from an information-theoretic point of view. Both a variability of tuning widths and a fragmentation of the neural population into specialized subpopulations improve the encoding accuracy.
ABSTRACT: We consider a simple dynamical system in three different ways, demonstrating that the entropy behaviour can be radically different depending on the perspective. Namely, the Boltzmann-Gibbs entropy of the entire (invertible) system may be constant, increasing or decreasing as a function of time. However, by taking a trace of an invertible dynamical system we may either obtain a system in which the entropy is continuously decreasing, or an exact (non-invertible) factor may be obtained which shows a global evolution of entropy to a unique equilibrium.
ABSTRACT: In a first, phenomenological part, various forms of alcoholism and their classification are summarized. Our aim is to give an explanation of these patterns of alcohol consumption and addiction. Social, psychological, behavioral, and biochemico-physiological mechanisms currently discussed in the literature are briefly considered with respect to their potential for generating the dynamics of alcohol addiction. At their core, these mechanisms turn out to be self-enhancing. By formulating these mechanisms in terms of mathematical models, a tool is provided to study their consequences in isolation and in qualitative and quantitative detail. The mathematical modeling is put forward in several steps, starting with a single (differential) equation for the process of self-enhancement. This process, acting alone, leads to exponential and unbounded increase of the average alcohol consumption. In a second step, inhibitory mechanisms are added which result, e.g., from negative effects of alcohol drinking. The combination and different relative weights of self-enhancement and inhibition can already explain several types of drinking behavior, in particular those of α, β, γ and δ drinkers according to the classification of Jellinek. Possible transitions from one type to another, and also to abstinence, become imaginable through a discussion of bifurcations on a 'cusp catastrophe'. Oscillatory phenomena in drinking behavior, e.g. the 'episodic drinker', result if and only if, in a further step, the model is complemented by a second differential equation accounting for the dynamics of an internal state symbolically called the 'frustration level'. More generally, this variable may represent any factor which interacts in a circular fashion (feedback loop) with the alcohol consumption. In connection with the presented models, various ways and strategies for the transition from one type of drinking behavior to another (including low-level drinking or abstinence) are discussed.
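The first two modeling steps can be sketched with a single differential equation. The logistic-style form below is a hedged illustration of "self-enhancement plus inhibition"; the paper's actual equations may differ, and all parameter values are assumptions.

```python
def consumption(r=0.5, k=0.0, a0=0.1, dt=0.01, t_max=20.0):
    """Euler integration of da/dt = r*a - k*a**2: self-enhancement of the
    average alcohol consumption a (rate r), optionally checked by an
    inhibitory term (strength k). Illustrative form only."""
    a = a0
    for _ in range(int(t_max / dt)):
        a += dt * (r * a - k * a * a)
    return a

runaway = consumption(k=0.0)    # self-enhancement alone: unbounded growth
bounded = consumption(k=0.25)   # inhibition caps consumption near r/k = 2.0
```

With k = 0 the trajectory grows exponentially without bound (the first modeling step); any positive inhibition strength k produces a stable plateau at r/k, matching the qualitative second step before the frustration-level variable is added.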
ABSTRACT: In a previous study, we calculated the resolution obtained by a population of overlapping receptive fields, assuming a coarse coding mechanism. The results, which favor large receptive fields, are applied to the visual system of tongue-projecting salamanders. An analytical calculation gives the number of neurons necessary to determine the direction of their prey. Direction localization and distance determination are studied in neural network simulations of the orienting movement and the tongue projection, respectively. In all cases, large receptive fields are found to be essential to yield a high sensory resolution. The results are in good agreement with anatomical, electrophysiological and behavioral data.
ABSTRACT: Electrophysiological studies in various sensory systems of different species show that many neurons involved in object localization have large receptive fields. This seems to contradict the high sensory resolution and the behavioral precision observed in localization experiments. Assuming a coarse coding mechanism, the resolution obtained by an ensemble of neurons is analytically calculated as a function of receptive field size. It is shown that particularly large receptive fields yield a high resolution.
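The coarse-coding argument can be illustrated numerically: with binary neurons and circular receptive fields in two dimensions, count how many distinct activation patterns the population produces over a grid of stimulus positions. More distinct patterns means finer localization. This is a toy sketch with assumed parameters, not the paper's analytical calculation.

```python
import numpy as np

def n_distinguishable(radius, n_neurons=30, grid=80, seed=3):
    """Count distinct binary activation patterns of a population of neurons
    with circular receptive fields over a grid of 2-D stimulus positions."""
    rng = np.random.default_rng(seed)
    centers = rng.uniform(0, 1, (n_neurons, 2))
    xs = np.linspace(0, 1, grid)
    pts = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
    # active[i, j] is True if stimulus position i lies inside field j
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    active = d2 < radius ** 2
    return len({row.tobytes() for row in active})

small = n_distinguishable(0.05)  # narrow fields: most positions look identical
large = n_distinguishable(0.35)  # broad, overlapping fields: many more patterns
```

Narrow, barely overlapping fields leave most of stimulus space with the same all-silent pattern, while broad overlapping fields carve it into many distinguishable regions, which is the intuition behind the analytical result.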
ABSTRACT: …umulations basically confirms the classification by Roth. To every neuron group of Roth, there is a corresponding group of the algorithm. Moreover, the algorithm finds two new groups. (Zentrum für Kognitionswissenschaften, Universität Bremen)
Publication Stats
258 Citations
33.02 Total Impact Points
Institutions

1985–2009

Universität Bremen
 Center of Cognitive Sciences
Bremen, Bremen, Germany


1999

Trinity College Dublin
 Department of Computer Science
Dublin, Ireland
