Théo Galy-Fajou
Technische Universität Berlin | TUB · Department of Software Engineering and Theoretical Computer Science

Doctor of Philosophy
Working at PlantingSpace

About

12 Publications · 1,946 Reads
103 Citations
Introduction
Théo Galy-Fajou currently works at the Department of Computer Science at the Technical University of Berlin with Manfred Opper. His research focuses on Gaussian processes and variational inference; his current project explores different variants of Gaussian processes.

Publications (12)
Article
Full-text available
Variational inference is a powerful framework, used to approximate intractable posteriors through variational distributions. The de facto standard is to rely on Gaussian variational families, which come with numerous advantages: they are easy to sample from, simple to parametrize, and many expectations are known in closed-form or readily computed b...
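
As background to this abstract: a Gaussian variational family approximates an intractable posterior p(θ | y) with q(θ) = N(θ; m, S) and fits (m, S) by maximizing the evidence lower bound (ELBO). This is the standard formulation, not a result specific to the paper:

    \mathcal{L}(m, S) = \mathbb{E}_{\mathcal{N}(\theta; m, S)}[\log p(y \mid \theta)] - \mathrm{KL}(\mathcal{N}(m, S) \,\|\, p(\theta))

One of the advantages alluded to above: when the prior is also Gaussian, p(θ) = N(0, K), the KL term is available in closed form,

    \mathrm{KL} = \tfrac{1}{2}\left(\mathrm{tr}(K^{-1} S) + m^{\top} K^{-1} m - d + \log|K| - \log|S|\right),

where d is the dimension of θ.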
Preprint
Full-text available
Gaussian Processes (GPs) are flexible non-parametric models with a strong probabilistic interpretation. While being a standard choice for performing inference on time series, GPs have few techniques to work in a streaming setting. Bui et al. (2017) developed an efficient variational approach to train online GPs by using sparsity techni...
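
For orientation, the generic streaming update underlying this line of work, sketched here as context rather than the paper's own derivation: the previous approximation q_{t-1} stands in for the exact posterior over all earlier batches, and each new batch y_t triggers one variational step,

    q_t(f) = \arg\min_{q \in \mathcal{Q}} \; \mathrm{KL}\!\left(q(f) \,\Big\|\, \tfrac{1}{Z_t}\, p(y_t \mid f)\, q_{t-1}(f)\right)

so the model is updated against a "prior" that already summarizes the past, without a full pass over old data.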
Preprint
We propose automated augmented conjugate inference, a new inference method for non-conjugate Gaussian process (GP) models. Our method automatically constructs an auxiliary variable augmentation that renders the GP model conditionally conjugate. Building on the conjugate structure of the augmented model, we develop two inference methods. First, a...
Conference Paper
Full-text available
We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. The new likelihood has two benefits: it leads to well-calibrated uncertainty estimates and allows for an efficient latent variable augmentation. The augmented model has the advantage that it is conditionally conju...
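
The "modified softmax" likelihood described here replaces the exponential in the ordinary softmax with the logistic function; to the best of our reading this is the logistic-softmax, so take the exact form below as a reconstruction rather than a quotation from the paper:

    p(y = k \mid \mathbf{f}) = \frac{\sigma(f_k)}{\sum_{j=1}^{K} \sigma(f_j)}, \qquad \sigma(z) = \frac{1}{1 + e^{-z}}

Because each σ(f_j) admits a Pólya-Gamma integral representation, this form is what opens the door to the latent-variable augmentation and conditional conjugacy mentioned in the abstract.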
Preprint
We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. The new likelihood has two benefits: it leads to well-calibrated uncertainty estimates and allows for an efficient latent variable augmentation. The augmented model has the advantage that it is conditionally conju...
Conference Paper
Full-text available
We present AugmentedGaussianProcesses.jl, a software package for augmented stochastic variational inference (ASVI) for Gaussian process models with non-conjugate likelihood functions. The idea of ASVI is to find an augmentation of the original GP model which renders the model conditionally conjugate and perform inference in the augmented model. We...
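
A minimal usage sketch of the package on a toy binary classification problem. The identifiers below (VGP, LogisticLikelihood, AnalyticVI, train!, predict_y) follow the package README at the time of writing and may differ between versions, so treat this as illustrative, not canonical:

    using AugmentedGaussianProcesses   # VGP model, likelihoods, inference objects
    using KernelFunctions              # kernels, e.g. SqExponentialKernel

    X = rand(100, 2)                   # toy inputs: 100 points in 2 dimensions
    y = rand([-1, 1], 100)             # toy binary labels in {-1, +1}

    # Variational GP with a logistic likelihood made conditionally conjugate by
    # augmentation; AnalyticVI runs the closed-form coordinate ascent updates.
    model = VGP(X, y, SqExponentialKernel(), LogisticLikelihood(), AnalyticVI())
    train!(model, 100)                 # 100 iterations of variational updates
    ŷ = predict_y(model, X)            # predicted labels at the training inputs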
Article
Full-text available
We propose an efficient stochastic variational approach to GP classification building on Pólya-Gamma data augmentation and inducing points, which is based on closed-form updates of natural gradients. We evaluate the algorithm on real-world datasets containing up to 11 million data points and demonstrate that it is up to three orders of magnitude f...
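
The Pólya-Gamma augmentation used here rests on the integral identity of Polson, Scott & Windle (2013): conditioned on ω the exponent is quadratic in z, so the logistic likelihood becomes Gaussian in the latent function and closed-form updates follow,

    \sigma(z) = \frac{1}{1 + e^{-z}} = \frac{1}{2}\, e^{z/2} \int_0^{\infty} e^{-\omega z^{2}/2}\; p_{\mathrm{PG}}(\omega \mid 1, 0)\, d\omega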
Conference Paper
Full-text available
This paper proposes a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. This form of likelihood allows for a latent variable augmentation that leads to a conditionally conjugate model and enables efficient variational inference via block coordinate ascent updates. Our experim...
Conference Paper
Full-text available
We propose an efficient stochastic variational approach to Gaussian Process (GP) classification building on Pólya-Gamma data augmentation and inducing points, which is based on closed-form updates of natural gradients. We evaluate the algorithm on real-world datasets containing up to 11 million data points and demonstrate that it is up to two order...
Article
Full-text available
We propose a fast inference method for Bayesian nonlinear support vector machines that leverages stochastic variational inference and inducing points. Our experiments show that the proposed method is faster than competing Bayesian approaches and scales easily to millions of data points. It provides additional features over frequentist competitors s...
Conference Paper
Full-text available
We propose a fast inference method for Bayesian nonlinear support vector machines that leverages stochastic variational inference and inducing points. Our experiments show that the proposed method is faster than competing Bayesian approaches and scales easily to millions of data points. It provides additional features over frequentist competitors s...
Article
Full-text available
We develop a variational inference (VI) scheme for the recently proposed Bayesian kernel support vector machine (SVM) and a stochastic version (SVI) for the linear Bayesian SVM. We compute the SVM's posterior, paving the way to apply attractive Bayesian techniques, as we exemplify in our experiments by means of automated model selection.
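
Background for this abstract: the Bayesian SVM builds on Polson & Scott's (2011) pseudo-likelihood for the hinge loss and its location-scale mixture representation, which is conditionally Gaussian given the latent scale λ_n; reproduced here from that paper, not from the abstract itself:

    \exp\{-2 \max(1 - y_n f_n,\, 0)\} = \int_0^{\infty} \frac{1}{\sqrt{2\pi\lambda_n}} \exp\!\left(-\frac{(1 + \lambda_n - y_n f_n)^2}{2\lambda_n}\right) d\lambda_n

This conditional Gaussianity is what lets the VI and SVI schemes mentioned above compute the SVM's posterior with standard Gaussian machinery.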
