Henry D. I. Abarbanel

University of California, San Diego, San Diego, California, United States

Publications (190) · 546.23 Total Impact

  • ABSTRACT: Cardiac rhythm management devices provide therapies for both arrhythmias and resynchronization but not heart failure, which affects millions of patients worldwide. This paper reviews recent advances in biophysics and mathematical engineering that provide a novel technological platform for addressing heart disease and enabling beat-to-beat adaptation of cardiac pacing in response to physiological feedback. The technology consists of silicon hardware central pattern generators (hCPG) that may be trained to emulate accurately the dynamical response of biological central pattern generators (bCPG). We discuss the limitations of present CPGs and appraise the advantages of analogue over digital circuits for application in bioelectronic medicine. To test the system, we have focused on the cardio-respiratory oscillators in the medulla oblongata that modulate heart rate in phase with respiration to induce respiratory sinus arrhythmia (RSA). We describe here a novel, scalable hCPG comprising physiologically realistic (Hodgkin-Huxley type) neurones and synapses. Our hCPG comprises two neurones that antagonise each other to provide rhythmic motor drive to the vagus nerve to slow the heart. We show how recent advances in modelling allow the motor output to adapt to physiological feedback such as respiration. In rats, we report on the restoration of RSA using an hCPG that receives diaphragmatic electromyography input and uses it to stimulate the vagus nerve at specific time points of the respiratory cycle to slow the heart rate. We have validated the adaptation of stimulation to alterations in respiratory rate. We demonstrate that the hCPG is tuneable in terms of the depth and timing of the RSA relative to respiratory phase. These pioneering studies will now permit an analysis of the physiological role of RSA as well as any potential therapeutic use in cardiac disease.
    The Journal of Physiology 11/2014; · 4.38 Impact Factor
  • ABSTRACT: Recent results demonstrate techniques for fully quantitative, statistical inference of the dynamics of individual neurons under the Hodgkin-Huxley framework of voltage-gated conductances. Using a variational approximation, this approach has been successfully applied to simulated data from model neurons. Here, we use this method to analyze a population of real neurons recorded in a slice preparation of the zebra finch forebrain nucleus HVC. Our results demonstrate that using only 1,500 ms of voltage recorded while injecting a complex current waveform, we can estimate the values of 12 state variables and 72 parameters in a dynamical model, such that the model accurately predicts the responses of the neuron to novel injected currents. A less complex model produced consistently worse predictions, indicating that the additional currents contribute significantly to the dynamics of these neurons. Preliminary results indicate some differences in the channel complement of the models for different classes of HVC neurons, which accords with expectations from the biology. Although the model for each cell is incomplete (representing only the somatic compartment, and likely missing classes of channels that the real neurons possess), our approach opens the possibility of investigating, through modeling, the plausibility of additional classes of channels the cell might possess, thus improving the models over time. These results provide an important foundational basis for building biologically realistic network models, such as the one in HVC that contributes to the process of song production and developmental vocal learning in songbirds.
    Biological Cybernetics 06/2014; · 2.07 Impact Factor
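
For readers who want to reproduce the flavor of these experiments, the sketch below is a minimal single-compartment Hodgkin-Huxley forward simulation driven by a piecewise-constant "complex" injected current. It uses only the textbook Na, K, and leak channels with standard squid-axon parameters, far simpler than the 72-parameter, 12-state models described above, and none of its values are fitted to HVC neurons.

```python
import numpy as np

# Classic squid-axon Hodgkin-Huxley parameters (mS/cm^2, mV, uF/cm^2)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

# Standard rate functions (the removable singularities at V = -40 and -55 mV are ignored)
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate(I_inj, dt=0.01):
    """Forward-Euler integration of the HH equations for a given injected-current trace."""
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    Vs = np.empty(len(I_inj))
    for i, I in enumerate(I_inj):
        I_ion = (g_Na * m**3 * h * (V - E_Na)
                 + g_K * n**4 * (V - E_K)
                 + g_L * (V - E_L))
        V += dt * (I - I_ion) / C_m
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        Vs[i] = V
    return Vs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dt, T = 0.01, 1500.0                         # ms, matching the 1,500 ms recordings
    steps = int(T / dt)
    # "Complex" stimulus: current steps of random amplitude, switching every 5 ms
    I = np.repeat(rng.uniform(-5.0, 15.0, steps // 500), 500)[:steps]
    V = simulate(I, dt)
    print("spikes (upward crossings of 0 mV):", int(np.sum((V[1:] > 0) & (V[:-1] <= 0))))
```
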
  • ABSTRACT: We investigate the dynamics of a conductance-based neuron model coupled to a model of intracellular calcium uptake and release by the endoplasmic reticulum. The intracellular calcium dynamics occur on a time scale that is orders of magnitude slower than voltage spiking behavior. Coupling these mechanisms sets the stage for the appearance of chaotic dynamics, which we observe within certain ranges of model parameter values. We then explore the question of whether one can, using observed voltage data alone, estimate the states and parameters of the voltage plus calcium (V+Ca) dynamics model. We find the answer is negative. Indeed, we show that voltage plus another observed quantity must be known to allow the estimation to be accurate. We show that observing both the voltage time course V(t) and the intracellular Ca time course will permit accurate estimation, and from the estimated model state, accurate prediction after observations are completed. This sets the stage for how one will be able to use a more detailed model of V+Ca dynamics in neuron activity in the analysis of experimental data on individual neurons as well as functional networks in which the nodes (neurons) have these biophysical properties.
    Physical Review E 06/2014; 89(6-1):062714. · 2.31 Impact Factor
  • Zhe An, Daniel Rey, Henry D. I. Abarbanel
    ABSTRACT: Utilizing the information in observations of a complex system to make accurate predictions through a quantitative model when observations are completed at time $T$ requires an accurate estimate of the full state of the model at time $T$. When the number of measurements $L$ at each observation time within the observation window is larger than a sufficient minimum value $L_s$, the impediments in the estimation procedure are removed. As the number of available observations is typically such that $L \ll L_s$, additional information from the observations must be presented to the model. We show how, using the time delays of the measurements at each observation time, one can augment the information transferred from the data to the model, removing the impediments to accurate estimation and permitting dependable prediction. We do this in a core geophysical fluid dynamics model, the shallow water equations, at the heart of numerical weather prediction. The method is quite general, however, and can be utilized in the analysis of a broad spectrum of complex systems where measurements are sparse. When the model of the complex system has errors, the method still enables accurate estimation of the state of the model and thus evaluation of the model errors in a manner separated from uncertainties in the data assimilation procedure.
    05/2014;
  • ABSTRACT: Estimating the behavior of a network of neurons requires accurate models of the individual neurons along with accurate characterizations of the connections among them. Whereas for a single cell, measurements of the intracellular voltage are technically feasible and sufficient to characterize a useful model of its behavior, making sufficient numbers of simultaneous intracellular measurements to characterize even small networks is infeasible. This paper builds on prior work on single neurons to explore whether knowledge of the time of spiking of neurons in a network, once the nodes (neurons) have been characterized biophysically, can provide enough information to usefully constrain the functional architecture of the network: the existence of synaptic links among neurons and their strength. Using standardized voltage and synaptic gating variable waveforms associated with a spike, we demonstrate that the functional architecture of a small network of model neurons can be established.
    Biological Cybernetics 04/2014; · 2.07 Impact Factor
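
The toy twin experiment below illustrates the general idea of constraining connectivity with standardized waveforms; it is not the paper's method. Presynaptic spike times are turned into assumed alpha-function conductance traces, a synthetic "measured" postsynaptic current is built from known weights plus noise, and the weights are recovered by linear least squares. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 0.1, 1000.0                      # ms
t = np.arange(0.0, T, dt)

def alpha_waveform(spikes, tau=5.0):
    """Standardized synaptic gating trace: a sum of alpha functions at the spike times."""
    s = np.zeros_like(t)
    for tk in spikes:
        m = t >= tk
        s[m] += ((t[m] - tk) / tau) * np.exp(1.0 - (t[m] - tk) / tau)
    return s

# Three hypothetical presynaptic neurons with Poisson-like spike trains
spike_trains = [np.sort(rng.uniform(0.0, T, rng.poisson(40))) for _ in range(3)]
S = np.column_stack([alpha_waveform(sp) for sp in spike_trains])

true_w = np.array([0.8, 0.0, 0.3])       # neuron 2 is not actually connected
I_meas = S @ true_w + 0.05 * rng.standard_normal(len(t))   # noisy "measurement"

w_hat, *_ = np.linalg.lstsq(S, I_meas, rcond=None)
print("true weights:     ", true_w)
print("estimated weights:", np.round(w_hat, 3))
```
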
  • ABSTRACT: Transferring information from observations to models of complex systems may meet impediments when the number of observations at any observation time is not sufficient. This is especially so when chaotic behavior is expressed. We show how to use time-delay embedding, familiar from nonlinear dynamics, to provide the information required to obtain accurate state and parameter estimates. Good estimates of parameters and unobserved states are necessary for good predictions of the future state of a model system. This method may be critical for understanding prediction in complex systems as varied as nervous systems and weather models, where insufficient measurements are typical.
    Physics Letters A 02/2014; · 1.63 Impact Factor
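
A minimal sketch of the time-delay construction referred to here, using the x component of the Lorenz-63 system as a stand-in for whatever scalar quantity is actually measured; the embedding dimension and delay below are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def delay_embed(y, dim, tau):
    """Rows are delay vectors [y(t), y(t+tau), ..., y(t+(dim-1)*tau)], tau in sample units."""
    n = len(y) - (dim - 1) * tau
    return np.column_stack([y[k * tau: k * tau + n] for k in range(dim)])

t_eval = np.arange(0.0, 50.0, 0.01)
sol = solve_ivp(lorenz63, (0.0, 50.0), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
x_obs = sol.y[0]                          # pretend only x(t) is measured

X = delay_embed(x_obs, dim=3, tau=10)     # 3 delay coordinates, delay of 10 samples
print("delay-coordinate matrix shape:", X.shape)
```
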
  • ABSTRACT: Hodgkin-Huxley (HH) models of neuronal membrane dynamics consist of a set of nonlinear differential equations that describe the time-varying conductance of various ion channels. Using observations of voltage alone we show how to estimate the unknown parameters and unobserved state variables of an HH model in the expected circumstance that the measurements are noisy, the model has errors, and the state of the neuron is not known when observations commence. The joint probability distribution of the observed membrane voltage and the unobserved state variables and parameters of these models is a path integral through the model state space. The solution to this integral allows estimation of the parameters and thus a characterization of many biological properties of interest, including channel complement and density, that give rise to a neuron's electrophysiological behavior. This paper describes a method for directly evaluating the path integral using a Monte Carlo numerical approach. This provides estimates not only of the expected values of model parameters but also of their posterior uncertainty. Using test data simulated from neuronal models comprising several common channels, we show that short (<50 ms) intracellular recordings from neurons stimulated with a complex time-varying current yield accurate and precise estimates of the model parameters as well as accurate predictions of the future behavior of the neuron. We also show that this method is robust to errors in model specification, supporting model development for biological preparations in which the channel expression and other biophysical properties of the neurons are not fully known.
    Biological Cybernetics 04/2012; 106(3):155-67. · 2.07 Impact Factor
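
As a much-reduced illustration of Monte Carlo parameter estimation with posterior uncertainty, the sketch below runs a random-walk Metropolis sampler for a single rate constant of a trivial decay model observed with Gaussian noise. It conveys how sampling yields both an expected parameter value and its spread, but it is not the path-integral sampler applied to Hodgkin-Huxley models in the paper; the model, noise level, and proposal width are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "experiment": noisy observations of exponential decay x(t) = exp(-k t)
k_true, sigma = 0.7, 0.05
t_obs = np.linspace(0.0, 5.0, 40)
y_obs = np.exp(-k_true * t_obs) + sigma * rng.standard_normal(t_obs.size)

def log_post(k):
    """Flat prior on k > 0 plus a Gaussian measurement-error likelihood."""
    if k <= 0:
        return -np.inf
    r = y_obs - np.exp(-k * t_obs)
    return -0.5 * np.sum(r**2) / sigma**2

samples, k, lp = [], 1.0, log_post(1.0)
for _ in range(20000):
    k_prop = k + 0.05 * rng.standard_normal()      # random-walk proposal
    lp_prop = log_post(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
        k, lp = k_prop, lp_prop
    samples.append(k)

burn = np.array(samples[5000:])                    # discard burn-in
print(f"posterior mean k = {burn.mean():.3f} +/- {burn.std():.3f} (true {k_true})")
```
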
  • ABSTRACT: Neuroscientists often propose detailed computational models to probe the properties of the neural systems they study. With the advent of neuromorphic engineering, there is an increasing number of hardware electronic analogs of biological neural systems being proposed as well. However, for both biological and hardware systems, it is often difficult to estimate the parameters of the model so that they are meaningful to the experimental system under study, especially when these models involve a large number of states and parameters that cannot be simultaneously measured. We have developed a procedure to solve this problem in the context of interacting neural populations using a recently developed dynamic state and parameter estimation (DSPE) technique. This technique uses synchronization as a tool for dynamically coupling experimentally measured data to its corresponding model to determine its parameters and internal state variables. Typically experimental data are obtained from the biological neural system and the model is simulated in software; here we show that this technique is also efficient in validating proposed network models for neuromorphic spike-based very large-scale integration (VLSI) chips and that it is able to systematically extract network parameters such as synaptic weights, time constants, and other variables that are not accessible by direct observation. Our results suggest that this method can become a very useful tool for model-based identification and configuration of neuromorphic multichip VLSI systems.
    Neural Computation 03/2012; 24(7):1669-94. · 1.76 Impact Factor
  • ABSTRACT: We present a method for using measurements of membrane voltage in individual neurons to estimate the parameters and states of the voltage-gated ion channels underlying the dynamics of the neuron's behavior. Short injections of a complex time-varying current provide sufficient data to determine the reversal potentials, maximal conductances, and kinetic parameters of a diverse range of channels, representing tens of unknown parameters and many gating variables in a model of the neuron's behavior. These estimates are used to predict the response of the model at times beyond the observation window. This method extends to the general problem of determining model parameters and unobserved state variables from a sparse set of observations, and may be applicable to networks of neurons. We describe an exact formulation of the tasks in nonlinear data assimilation when one has noisy data, errors in the models, and incomplete information about the state of the system when observations commence. This is a high dimensional integral along the path of the model state through the observation window. In this article, a stationary path approximation to this integral, using a variational method, is described and tested employing data generated using neuronal models comprising several common channels with Hodgkin-Huxley dynamics. These numerical experiments reveal a number of practical considerations in designing stimulus currents and in determining model consistency. The tools explored here are computationally efficient and have paths to parallelization that should allow large individual neuron and network problems to be addressed.
    Biological Cybernetics 10/2011; 105(3-4):217-37. · 2.07 Impact Factor
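
The sketch below is a minimal stationary-path (weak-constraint) calculation for a one-dimensional chaotic map rather than a neuron model: a discrete "action" made of measurement-error and model-error terms is minimized over the whole path and one parameter. The map, noise levels, and weights are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)

def f(x, r):                        # toy model dynamics: the logistic map
    return r * x * (1.0 - x)

# Twin experiment: a "true" chaotic path and noisy observations of it
N, r_true, noise = 60, 3.8, 0.01
x_true = np.empty(N); x_true[0] = 0.3
for n in range(N - 1):
    x_true[n + 1] = f(x_true[n], r_true)
y_obs = x_true + noise * rng.standard_normal(N)

def residuals(z, sig_meas=0.01, sig_model=1e-3):
    """Stacked measurement-error and model-error terms of the discrete 'action'."""
    x, r = z[:N], z[N]
    return np.concatenate([(x - y_obs) / sig_meas,
                           (x[1:] - f(x[:-1], r)) / sig_model])

z0 = np.concatenate([y_obs, [3.5]])   # start from the data and a guessed parameter
fit = least_squares(residuals, z0)
print(f"estimated r = {fit.x[N]:.4f} (true {r_true})")
```
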
  • John C. Quinn, Henry D. I. Abarbanel
    ABSTRACT: The answers to data assimilation questions can be expressed as path integrals over all possible state and parameter histories. We show how these path integrals can be evaluated numerically using a Markov Chain Monte Carlo method designed to run in parallel on a Graphics Processing Unit (GPU). We demonstrate the application of the method to an example with a transmembrane voltage time series of a simulated neuron as an input, and using a Hodgkin-Huxley neuron model. By taking advantage of GPU computing, we gain a parallel speedup factor of up to about 300, compared to an equivalent serial computation on a CPU, with performance increasing as the length of the observation time used for data assimilation increases.
    Journal of Computational Physics 03/2011; 230(22). · 2.14 Impact Factor
  • Henry D. I. Abarbanel
    ABSTRACT: In using data assimilation to import information from observations to estimate parameters and state variables of a model, one must assume a distribution for the noise in the measurements and in the model errors. Using the path integral formulation of data assimilation [Abarbanel 2009], we introduce the idea of self-consistency of the distribution of stochastic model errors: the distribution of model errors from the path integral with observed data should be consistent with the assumption made in formulating the path integral. The path integral setting for data assimilation is discussed to provide the setting for the consistency test. Using two examples drawn from the 1996 Lorenz model, for $D = 100$ and for $D = 20$, we show how one can test for this inconsistency with essentially no additional effort beyond that expended in extracting answers to interesting questions from data assimilation itself.
    12/2010;
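
For reference, here is a minimal integration of the Lorenz 1996 model used in these examples; D = 20 matches one of the two cases, while the forcing F = 8 is a common textbook choice that produces chaotic behavior and is assumed here rather than taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz96(t, x, F=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

D = 20                                        # number of dynamical variables
x0 = 8.0 * np.ones(D)
x0[0] += 0.01                                 # nudge off the unstable fixed point
sol = solve_ivp(lorenz96, (0.0, 100.0), x0,
                t_eval=np.arange(20.0, 100.0, 0.05), rtol=1e-8)  # drop the transient
print("trajectory shape:", sol.y.shape, " state std:", sol.y.std().round(2))
```
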
  • ABSTRACT: Throughout the brain, neurons encode information in fundamental units of spikes. Each spike represents the combined thresholding of synaptic inputs and intrinsic neuronal dynamics. Here, we address a basic question of spike train formation: how do perithreshold synaptic inputs perturb the output of a spiking neuron? We recorded from single entorhinal principal cells in vitro and drove them to spike steadily at ∼5 Hz (theta range) with direct current injection, then used a dynamic-clamp to superimpose strong excitatory conductance inputs at varying rates. Neurons spiked most reliably when the input rate matched the intrinsic neuronal firing rate. We also found a striking tendency of neurons to preserve their rates and coefficients of variation, independently of input rates. As mechanisms for this rate maintenance, we show that the efficacy of the conductance inputs varied with the relationship of input rate to neuronal firing rate, and with the arrival time of the input within the natural period. Using a novel method of spike classification, we developed a minimal Markov model that reproduced the measured statistics of the output spike trains and thus allowed us to identify and compare contributions to the rate maintenance and resonance. We suggest that the strength of rate maintenance may be used as a new categorization scheme for neuronal response and note that individual intrinsic spiking mechanisms may play a significant role in forming the rhythmic spike trains of activated neurons; in the entorhinal cortex, individual pacemakers may dominate production of the regional theta rhythm.
    European Journal of Neuroscience 10/2010; 32(11):1930-9. · 3.75 Impact Factor
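
Below is a toy two-class Markov spike-train generator in the spirit of the minimal model described above; the class labels, transition probabilities, and ISI statistics are invented for illustration, and the sketch only shows how such a model produces an output firing rate and coefficient of variation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical spike classes and their mean inter-spike intervals (ms)
classes = ["intrinsic", "input-driven"]
mean_isi = {"intrinsic": 200.0, "input-driven": 120.0}
# Hypothetical transition matrix P[i, j] = Pr(next class j | current class i)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

state, isis = 0, []
for _ in range(5000):
    # Gamma-distributed ISIs (shape 4) with a class-dependent mean
    isis.append(rng.gamma(4.0, mean_isi[classes[state]] / 4.0))
    state = rng.choice(2, p=P[state])

isis = np.array(isis)
rate_hz = 1000.0 / isis.mean()
cv = isis.std() / isis.mean()
print(f"firing rate ~ {rate_hz:.2f} Hz, ISI CV ~ {cv:.2f}")
```
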
  • John C. Quinn, Henry D. I. Abarbanel
    ABSTRACT: The process of transferring information from observations of a dynamical system to estimate the fixed parameters and unobserved states of a system model can be formulated as the evaluation of a discrete-time path integral in model state space. The observations serve as a guiding ‘potential’ working with the dynamical rules of the model to direct system orbits in state space. The path-integral representation permits direct numerical evaluation of the conditional mean path through the state space as well as conditional moments about this mean. Using a Monte Carlo method for selecting paths through state space, we show how these moments can be evaluated and demonstrate, in an interesting model system, the explicit role of the transfer of information from the observations. We address the question of how many observations are required to estimate the unobserved state variables, and we examine the assumptions of Gaussianity of the underlying conditional probability.
    Quarterly Journal of the Royal Meteorological Society 09/2010; 136(652):1855 - 1867. · 3.33 Impact Factor
  • Henry D. I. Abarbanel, Mark Kostuk, William Whartenby
    ABSTRACT: In variational formulations of data assimilation, the estimation of parameters or initial state values by a search for a minimum of a cost function can be hindered by the numerous local minima in the dependence of the cost function on those quantities. We argue that this is a result of instability on the synchronization manifold where the observations are required to match the model outputs in the situation where the data and the model are chaotic. The solution to this impediment to estimation is given as controls moving the positive conditional Lyapunov exponents on the synchronization manifold to negative values and adding to the cost function a penalty that drives those controls to zero as a result of the optimization process implementing the assimilation. This is seen as the solution to the proper size of ‘nudging’ terms: they are zero once the estimation has been completed, leaving only the physics of the problem to govern forecasts after the assimilation window. We show how this procedure, called Dynamical State and Parameter Estimation (DSPE), works in the case of the Lorenz96 model with nine dynamical variables. Using DSPE, we are able to accurately estimate the fixed parameter of this model and all of the state variables, observed and unobserved, over an assimilation time interval [0, T]. Using the state variables at T and the estimated fixed parameter, we are able to accurately forecast the state of the model for t > T up to those times where the chaotic behaviour of the system interferes with forecast accuracy.
    Quarterly Journal of the Royal Meteorological Society 04/2010; 136(648):769 - 783. · 3.33 Impact Factor
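
The sketch below is a bare-bones illustration of the nudging idea behind DSPE, not an implementation of it: a Lorenz96 model with nine variables is coupled to synthetic data through a term k(y_obs - x) on the observed components, and a grid search over the forcing parameter shows the synchronization error is smallest near the true value. The gain k, the choice of observed components, and F = 8 are assumptions made for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

def lorenz96(t, x, F):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

D, F_true, k = 9, 8.0, 20.0              # nine variables as in the abstract; F, k illustrative
obs = np.array([0, 2, 4, 6, 8])          # indices of the "measured" components
t_eval = np.arange(0.0, 20.0, 0.01)

# Generate a "data" trajectory from the true model
rng = np.random.default_rng(5)
x0 = 8.0 + 0.1 * rng.standard_normal(D)
data = solve_ivp(lorenz96, (0.0, 20.0), x0, t_eval=t_eval, args=(F_true,), rtol=1e-8).y.T
y_of_t = interp1d(t_eval, data, axis=0, fill_value="extrapolate")

def nudged(t, x, F):
    """Model equations with a control term k*(data - model) on the observed components."""
    dx = lorenz96(t, x, F)
    dx[obs] += k * (y_of_t(t)[obs] - x[obs])
    return dx

def sync_error(F):
    sol = solve_ivp(nudged, (0.0, 20.0), np.full(D, 5.0), t_eval=t_eval, args=(F,), rtol=1e-6)
    return np.mean((sol.y.T[1000:, obs] - data[1000:, obs]) ** 2)   # skip the transient

for F in (6.0, 7.0, 8.0, 9.0, 10.0):
    print(f"F = {F:4.1f}   synchronization error = {sync_error(F):.4f}")
```
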
  • Henry D. I. Abarbanel
    ABSTRACT: Ensemble data assimilation is a problem in determining the most likely phase space trajectory of a model of an observed dynamical system as it receives inputs from measurements passing information to the model. Using methods developed in statistical physics, we present effective actions and equations of motion for the mean orbits associated with the temporal development of a dynamical model when it has errors, there is uncertainty in its initial state, and it receives information from measurements. If there are correlations among errors in the measurements they are naturally included in this approach.
    08/2009;
  • ABSTRACT: Measures of multiple spike train synchrony are essential in order to study issues such as spike timing reliability, network synchronization, and neuronal coding. These measures can broadly be divided into multivariate measures and averages over bivariate measures. One of the most recent bivariate approaches, the ISI-distance, employs the ratio of instantaneous interspike intervals (ISIs). In this study we propose two extensions of the ISI-distance, the straightforward averaged bivariate ISI-distance and the multivariate ISI-diversity based on the coefficient of variation. Like the original measure these extensions combine many properties desirable in applications to real data. In particular, they are parameter-free, time scale independent, and easy to visualize in a time-resolved manner, as we illustrate with in vitro recordings from a cortical neuron. Using a simulated network of Hindmarsh-Rose neurons as a controlled configuration we compare the performance of our methods in distinguishing different levels of multi-neuron spike train synchrony to the performance of six other previously published measures. We show and explain why the averaged bivariate measures perform better than the multivariate ones and why the multivariate ISI-diversity is the best performer among the multivariate methods. Finally, in a comparison against standard methods that rely on moving window estimates, we use single-unit monkey data to demonstrate the advantages of the instantaneous nature of our methods.
    Journal of neuroscience methods 08/2009; 183(2):287-99. · 2.30 Impact Factor
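
Here is a compact sketch of the bivariate ISI-distance idea: at each sample time the current inter-spike interval of each train is looked up and a sign-symmetric ratio of the two is averaged. Edge handling and normalization are simplified relative to the published measure, and all spike trains below are synthetic.

```python
import numpy as np

def current_isi(spikes, t_grid):
    """Length of the inter-spike interval containing each time in t_grid."""
    idx = np.searchsorted(spikes, t_grid, side="right")
    return spikes[idx] - spikes[idx - 1]

def isi_distance(spikes1, spikes2, t_grid):
    """Time-averaged instantaneous ISI dissimilarity of two spike trains."""
    x1, x2 = current_isi(spikes1, t_grid), current_isi(spikes2, t_grid)
    ratio = np.where(x1 <= x2, x1 / x2 - 1.0, -(x2 / x1 - 1.0))
    return np.mean(np.abs(ratio))

rng = np.random.default_rng(6)
T = 10.0                                          # seconds
t_grid = np.linspace(0.5, 9.5, 2000)              # stay away from the edges for simplicity
train_a = np.sort(rng.uniform(0.0, T, 50))                                     # Poisson-like
train_b = np.sort(np.clip(train_a + 0.01 * rng.standard_normal(50), 0.0, T))   # jittered copy
train_c = np.sort(rng.uniform(0.0, T, 50))                                     # independent

# Pad with boundary "spikes" so every grid time lies inside some interval
pad = lambda s: np.concatenate([[0.0], s, [T]])
print("similar trains:    ", round(isi_distance(pad(train_a), pad(train_b), t_grid), 3))
print("independent trains:", round(isi_distance(pad(train_a), pad(train_c), t_grid), 3))
```
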
  • ABSTRACT: We examine the use of synchronization as a mechanism for extracting parameter and state information from experimental systems. We focus on important aspects of this problem that have received little attention previously and we explore them using experiments and simulations with the chaotic Colpitts oscillator as an example system. We explore the impact of model imperfection on the ability to extract valid information from an experimental system. We compare two optimization methods: an initial value method and a constrained method. Each of these involves coupling the model equations to the experimental data in order to regularize the chaotic motions on the synchronization manifold. We explore both time-dependent and time-independent coupling and discuss the use of periodic impulse coupling. We also examine both optimized and fixed (or manually adjusted) coupling. For the case of an optimized time-dependent coupling function u(t) we find a robust structure which includes sharp peaks and intervals where it is zero. This structure shows a strong correlation with the location in phase space and appears to depend on noise, imperfections of the model, and the Lyapunov direction vectors. For time-independent coupling we find the counterintuitive result that often the optimal rms error in fitting the model to the data initially increases with coupling strength. Comparison of this result with that obtained using simulated data may provide one measure of model imperfection. The constrained method with time-dependent coupling appears to have benefits in synchronizing long data sets with minimal impact, while the initial value method with time-independent coupling tends to be substantially faster, more flexible, and easier to use. We also describe a method of coupling which is useful for sparse experimental data sets. Our use of the Colpitts oscillator allows us to explore in detail the case of a system with one positive Lyapunov exponent. The methods we explored are easily extended to driven systems such as neurons with time-dependent injected current. They are expected to be of value in nonchaotic systems as well. Software is available on request.
    Physical Review E 08/2009; 80(1 Pt 2):016201. · 2.31 Impact Factor
  • Leif Gibb, Timothy Q. Gentner, Henry D. I. Abarbanel
    ABSTRACT: The telencephalic premotor nucleus HVC is situated at a critical point in the pattern-generating premotor circuitry of oscine songbirds. A striking feature of HVC's premotor activity is that its projection neurons burst extremely sparsely. Here we present a computational model of HVC embodying several central hypotheses: 1) sparse bursting is generated in bistable groups of recurrently connected robust nucleus of the arcopallium (RA)-projecting (HVCRA) neurons; 2) inhibitory interneurons terminate bursts in the HVCRA groups; and 3) sparse sequences of bursts are generated by the propagation of waves of bursting activity along networks of HVCRA neurons. Our model of sparse bursting places HVC in the context of central pattern generators and cortical networks using inhibition, recurrent excitation, and bistability. Importantly, the unintuitive result that inhibitory interneurons can precisely terminate the bursts of HVCRA groups while showing relatively sustained activity throughout the song is made possible by a specific constraint on their connectivity. We use the model to make novel predictions that can be tested experimentally.
    Journal of Neurophysiology 07/2009; 102(3):1748-62. · 3.30 Impact Factor
  • Leif Gibb, Timothy Q. Gentner, Henry D. I. Abarbanel
    ABSTRACT: Uncovering the roles of neural feedback in the brain is an active area of experimental research. In songbirds, the telencephalic premotor nucleus HVC receives neural feedback from both forebrain and brain stem areas. Here we present a computational model of birdsong sequencing that incorporates HVC and associated nuclei and builds on the model of sparse bursting presented in our preceding companion paper. Our model embodies the hypotheses that 1) different networks in HVC control different syllables or notes of birdsong, 2) interneurons in HVC not only participate in sparse bursting but also provide mutual inhibition between networks controlling syllables or notes, and 3) these syllable networks are sequentially excited by neural feedback via the brain stem and the afferent thalamic nucleus Uva, or a similar feedback pathway. We discuss the model's ability to unify physiological, behavioral, and lesion results and we use it to make novel predictions that can be tested experimentally. The model suggests a neural basis for sequence variations, shows that stimulation in the feedback pathway may have different effects depending on the balance of excitation and inhibition at the input to HVC from Uva, and predicts deviations from uniform expansion of syllables and gaps during HVC cooling.
    Journal of Neurophysiology 07/2009; 102(3):1763-78. · 3.30 Impact Factor

Publication Stats

7k Citations
546.23 Total Impact Points

Institutions

  • 1970–2014
    • University of California, San Diego
      • Department of Physics
      • Institute for Nonlinear Science (INLS)
      San Diego, California, United States
  • 2012
    • ETH Zurich
      • Institute of Neuroinformatics
      Zürich, ZH, Switzerland
  • 1989–2010
    • CSU Mentor
      Long Beach, California, United States
  • 2008–2009
    • The Scripps Research Institute
      La Jolla, California, United States
  • 1987–2009
    • National University (California)
      San Diego, California, United States
  • 2007
    • Max Planck Institute of Neurobiology
      München, Bavaria, Germany
  • 2006
    • University of Freiburg
      • Center for Data Analysis and Modeling (FDM)
      Freiburg, Baden-Württemberg, Germany
  • 2005
    • Freie Universität Berlin
      • Institute of Biology
      Berlin, Land Berlin, Germany
  • 2004
    • Hungarian Academy of Sciences
      Budapest, Hungary
  • 2001–2003
    • California Institute of Technology
      • Department of Physics
      • Division of Biology
      Pasadena, California, United States
    • Salk Institute
      La Jolla, California, United States
  • 1997
    • University of Great Falls
      Great Falls, Montana, United States
  • 1996
    • University of California, Santa Barbara
      • Department of Physics
      Santa Barbara, California, United States