Online Variational Inference for State-Space Models with Point-Process Observations
Department of Automatic Control and Systems Engineering, University of Sheffield, U.K.
Neural Computation (Impact Factor: 2.21). 08/2011; 23(8):1967-99. DOI: 10.1162/NECO_a_00156. Source: PubMed
ABSTRACT
We present a variational Bayesian (VB) approach to state and parameter inference for a state-space model with point-process observations, a physiologically plausible model for signal processing of spike data. We also derive a variational smoother, as well as an efficient online filtering algorithm, which can also be used to track changes in physiological parameters. The methods are assessed on simulated data, and results are compared to expectation-maximization as well as Monte Carlo estimation techniques in order to evaluate the accuracy of the proposed approach. The VB filter is further assessed on a data set of taste-response neural cells, showing that the proposed approach can effectively capture dynamical changes in neural responses in real time.
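As an illustration of the class of models the abstract refers to, the following minimal sketch simulates a latent AR(1) state driving the conditional intensity of a point process observed in small time bins, in the style of Smith and Brown (2003). All numerical values (`rho`, `sigma_e`, `mu`, `beta`, `Delta`) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Illustrative sketch (not the paper's code) of a state-space model with
# point-process observations:
#   latent state:          x_k = rho * x_{k-1} + e_k,  e_k ~ N(0, sigma_e^2)
#   conditional intensity: lambda_k = exp(mu + beta * x_k)
#   binned spikes:         y_k ~ Bernoulli(lambda_k * Delta), Delta small

rng = np.random.default_rng(0)

K = 1000           # number of time bins (assumed)
Delta = 0.001      # bin width in seconds (assumed)
rho = 0.99         # AR(1) coefficient (assumed)
sigma_e = 0.1      # state-noise standard deviation (assumed)
mu = np.log(10.0)  # baseline log-intensity, roughly 10 spikes/s (assumed)
beta = 1.0         # state-to-intensity gain, fixed as discussed in the text

x = np.zeros(K)
y = np.zeros(K, dtype=int)
for k in range(1, K):
    x[k] = rho * x[k - 1] + sigma_e * rng.standard_normal()
    lam = np.exp(mu + beta * x[k])
    y[k] = rng.random() < min(lam * Delta, 1.0)  # at most one spike per bin

print(y.sum(), "spikes in", K * Delta, "s")
```

Filtering and smoothing in the paper operate on exactly this kind of binned spike sequence `y`, with `x` unobserved.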

 "This arises from the fact that the parameter β appears in the likelihood only via the product β_c x_k, and the term α multiplies a binary stimulus that is nonzero only at sparse points in time. This makes α and β difficult to estimate, as Smith and Brown (2003) and Zammit-Mangion et al. (2011) noted. In practice, we fix β_c and σ_ε² to ensure a strongly identifiable model, as in previous work."
ABSTRACT: This letter considers how a number of modern Markov chain Monte Carlo (MCMC) methods can be applied to parameter estimation and inference in state-space models with point-process observations. We quantified the efficiency of these MCMC methods on synthetic data, and our results suggest that the Riemannian manifold Hamiltonian Monte Carlo method offers the best performance. We further compared this method with a previously tested variational Bayes method on two experimental data sets. Results indicate similar performance on the large data sets and superior performance on small ones. The work offers an extensive suite of MCMC algorithms evaluated on an important class of models for physiological signal analysis.
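For readers unfamiliar with the MCMC baseline, the sketch below runs a plain random-walk Metropolis sampler (deliberately much simpler than the Riemannian manifold HMC evaluated in the letter) on a toy posterior over a single constant log-intensity parameter given binned spikes. The model, step size, and all values are illustrative assumptions, not the letter's setup.

```python
import numpy as np

# Toy random-walk Metropolis sketch: sample the posterior of a constant
# log-intensity mu given binned spikes y_k ~ Bernoulli(exp(mu) * Delta),
# under a flat prior. Not the letter's Riemannian manifold HMC.

rng = np.random.default_rng(1)
Delta = 0.001
y = (rng.random(2000) < 10.0 * Delta).astype(int)  # synthetic ~10 Hz spikes

def log_lik(mu):
    lam_dt = np.clip(np.exp(mu) * Delta, 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(lam_dt) + (1 - y) * np.log(1 - lam_dt))

mu, ll = 0.0, log_lik(0.0)
samples = []
for _ in range(5000):
    prop = mu + 0.2 * rng.standard_normal()   # symmetric random-walk proposal
    ll_prop = log_lik(prop)
    if np.log(rng.random()) < ll_prop - ll:    # Metropolis accept/reject
        mu, ll = prop, ll_prop
    samples.append(mu)

post_mean = np.mean(samples[1000:])            # discard burn-in
print("posterior mean of mu:", post_mean)      # roughly log(10) for this data
```

Gradient-based samplers such as HMC, and especially its Riemannian manifold variant, improve on this random walk by exploiting the geometry of the log-posterior, which is why they tend to dominate in the comparisons the letter reports.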
 "…sation method in the M-step implemented in Matlab. Alternative methods based on variational approximations or MCMC sampling have been reported to be more costly than Laplace-EM [13], [24]."
ABSTRACT: Latent linear dynamical systems with generalised-linear observation models arise in a variety of applications, for instance when modelling the spiking activity of populations of neurons. Here, we show how spectral learning methods (usually called subspace identification in this context) for linear systems with linear-Gaussian observations can be extended to estimate the parameters of a generalised-linear dynamical system model despite a nonlinear and non-Gaussian observation process. We use this approach to obtain estimates of parameters for a dynamical model of neural population data, where the observed spike counts are Poisson-distributed with log-rates determined by the latent dynamical process, possibly driven by external inputs. We show that the extended subspace identification algorithm is consistent and accurately recovers the correct parameters on large simulated data sets with a single calculation, avoiding the costly iterative computation of approximate expectation-maximisation (EM). Even on smaller data sets, it provides an effective initialisation for EM, avoiding local optima and speeding convergence. These benefits are shown to extend to real neural data.
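The linear-Gaussian core of subspace identification can be sketched as follows: a Hankel matrix built from lagged output covariances has rank equal to the latent state dimension, which appears as a gap in its singular-value spectrum. This is a hedged toy of the classical step only, not the Poisson extension described above; the system matrices `A` and `C` and all sizes are arbitrary illustrative choices.

```python
import numpy as np

# Toy subspace-identification step for a linear-Gaussian system (not the
# generalised-linear extension): the Hankel matrix of lagged output
# covariances Cov(y_{t+i}, y_t) = C A^i P C^T has rank equal to the
# latent dimension, so its singular values reveal the state dimension.

rng = np.random.default_rng(2)
T, d = 20000, 2                                      # samples, true latent dim
A = np.array([[0.9, 0.1], [0.0, 0.8]])               # assumed dynamics
C = np.array([[1.0, 0.0], [0.5, 1.0], [0.0, 1.0]])   # 3 observed channels

x = np.zeros(d)
Y = np.zeros((T, 3))
for t in range(T):
    x = A @ x + rng.standard_normal(d)               # unit state noise
    Y[t] = C @ x + 0.1 * rng.standard_normal(3)      # small observation noise

# Block-Hankel matrix of empirical lagged covariances, lags 1 .. 2k-1.
k = 3
covs = [(Y[i:].T @ Y[:T - i]) / (T - i) for i in range(1, 2 * k + 1)]
H = np.block([[covs[i + j] for j in range(k)] for i in range(k)])

s = np.linalg.svd(H, compute_uv=False)
print("leading singular values:", np.round(s[:4], 2))
# The spectrum should drop sharply after index d-1 (here, after the 2nd value).
```

The cited work's contribution is, roughly, a moment-conversion step that lets the same machinery be applied when `Y` consists of Poisson spike counts with log-rates driven by the latent state.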
ABSTRACT: Measuring agreement between a statistical model and a spike train data series, that is, evaluating goodness of fit, is crucial for establishing the model's validity before using it to make inferences about a particular neural system. Assessing goodness of fit is a challenging problem for point-process neural spike train models, especially for histogram-based models such as peristimulus time histograms (PSTH) and rate functions estimated by spike train smoothing. The time-rescaling theorem is a well-known result in probability theory, which states that any point process with an integrable conditional intensity function may be transformed into a Poisson process with unit rate. We describe how the theorem may be used to develop goodness-of-fit tests for both parametric and histogram-based point-process models of neural spike trains. We apply these tests in two examples: a comparison of PSTH, inhomogeneous Poisson, and inhomogeneous Markov interval models of neural spike trains from the supplementary eye field of a macaque monkey, and a comparison of temporal and spatial smoothers, inhomogeneous Poisson, inhomogeneous gamma, and inhomogeneous inverse Gaussian models of rat hippocampal place cell spiking activity. To help make the logic behind the time-rescaling theorem more accessible to researchers in neuroscience, we present a proof using only elementary probability theory arguments. We also show how the theorem may be used to simulate a general point process model of a spike train. Our paradigm makes it possible to compare parametric and histogram-based neural spike train models directly. These results suggest that the time-rescaling theorem can be a valuable tool for neural spike train data analysis.
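The time-rescaling construction described above is straightforward to apply in practice: integrate the model's conditional intensity between successive spikes, so that under a correct model the rescaled intervals are i.i.d. Exponential(1), or, after a further transform, Uniform(0, 1), which can be checked with a Kolmogorov-Smirnov test. A minimal sketch, here using a homogeneous Poisson process whose true intensity is known (rate, duration, and seed are illustrative assumptions):

```python
import numpy as np
from scipy import stats

# Time-rescaling goodness-of-fit check: if lambda(t) is the correct
# intensity, tau_j = Lambda(t_j) - Lambda(t_{j-1}) with Lambda the
# integrated intensity is Exponential(1), so z_j = 1 - exp(-tau_j)
# is Uniform(0, 1).

rng = np.random.default_rng(3)

# Simulate a homogeneous Poisson process at 5 Hz over at most 200 s ...
rate = 5.0
spikes = np.cumsum(rng.exponential(1.0 / rate, size=1000))
spikes = spikes[spikes < 200.0]

# ... and rescale with the (here, correct) model: Lambda(t) = rate * t.
Lam = rate * spikes
tau = np.diff(np.concatenate(([0.0], Lam)))  # rescaled inter-spike intervals
z = 1.0 - np.exp(-tau)                       # should be Uniform(0, 1)

# Kolmogorov-Smirnov test against Uniform(0, 1); a small p-value
# would indicate lack of fit.
ks = stats.kstest(z, "uniform")
print("KS statistic: %.3f, p-value: %.3f" % (ks.statistic, ks.pvalue))
```

For a misspecified model (say, rescaling with the wrong rate), the same test rejects: the rescaled intervals pile up away from uniformity, which is exactly the diagnostic the abstract describes for PSTH and smoothing-based rate estimates.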