AI & Fundamentals
Neural Stochastic Differential Equations for Irregularly-Sampled Time Series - David Duvenaud, Assistant Professor, U of T

DATE: Fri, December 6, 2019 - 11:30 am

LOCATION: Hugh Dempster Pavilion (DMP) - 110, 6245 Agronomy Road, Vancouver, BC

DETAILS

Abstract:

Much real-world data is sampled at irregular intervals, but most time series models require regularly-sampled data. Continuous-time state-space models can address this problem, but until now only linear-Gaussian or deterministic models were efficiently trainable. We construct a scalable algorithm for computing gradients of samples from stochastic differential equations, and for gradient-based stochastic variational inference in function space, both using adaptive black-box SDE solvers. This allows us to fit a new family of richly-parameterized distributions over functions. We'll show initial results of applying latent SDEs to time series data, and discuss prototypes of infinitely-deep Bayesian neural networks.
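
As a rough illustration of the kind of computation the abstract describes (not the talk's actual algorithm, which uses adaptive black-box SDE solvers rather than a fixed-step loop), the hypothetical PyTorch sketch below differentiates through an Euler-Maruyama solve of a neural SDE evaluated at irregular times. The names (f_net, g_net, sdeint_euler), network sizes, and loss are all invented for illustration.

```python
import torch

# Hypothetical latent-SDE components: dz = f(z, t) dt + g(z, t) dB_t.
# Networks, sizes, and names are illustrative, not from the talk.
latent_dim = 4
f_net = torch.nn.Sequential(  # drift network
    torch.nn.Linear(latent_dim + 1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, latent_dim))
g_net = torch.nn.Sequential(  # diagonal diffusion network
    torch.nn.Linear(latent_dim + 1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, latent_dim), torch.nn.Softplus())  # keep noise scale positive

def sdeint_euler(z0, ts):
    """Euler-Maruyama solve of the SDE at (possibly irregular) times ts.

    Every step is an ordinary differentiable tensor op, so gradients of the
    sampled path flow back to the parameters of f_net and g_net
    (the pathwise / reparameterization gradient).
    """
    zs, z = [z0], z0
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dt = t1 - t0
        tz = torch.cat([z, t0 * torch.ones(z.shape[0], 1)], dim=-1)
        dB = torch.randn_like(z) * dt.sqrt()  # Brownian increment
        z = z + f_net(tz) * dt + g_net(tz) * dB
        zs.append(z)
    return torch.stack(zs)  # shape: (len(ts), batch, latent_dim)

# Irregular observation times -- no resampling onto a regular grid needed.
ts = torch.tensor([0.0, 0.3, 0.35, 1.1, 1.2, 2.0])
paths = sdeint_euler(torch.zeros(8, latent_dim), ts)  # batch of 8 sample paths

# Toy squared-error loss against (fake) observations at those times.
targets = torch.randn_like(paths)
loss = ((paths - targets) ** 2).mean()
loss.backward()  # gradients of a sample from the SDE w.r.t. all parameters
```

Because the solver is queried only at the observation times, irregularly-sampled series pose no special difficulty; the adaptive solvers mentioned in the abstract additionally choose internal step sizes to control error.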

Bio:

David Duvenaud is an assistant professor in computer science and statistics at the University of Toronto, where he holds a Canada Research Chair in generative models. He did his postdoctoral research at Harvard University, where he worked on hyperparameter optimization, variational inference, and automatic chemical design. He completed his Ph.D. at the University of Cambridge, studying Bayesian nonparametrics with Zoubin Ghahramani and Carl Rasmussen. David spent two summers on the machine vision team at Google Research, and also co-founded Invenia, an energy forecasting and trading company. David is a founding member of the Vector Institute and a Faculty Fellow at ElementAI.

Host:

Associate Professor Mark Schmidt, Computer Science, UBC

CAIDA Contact: 

Arynn Keane

arynnk@mail.ubc.ca

