
AMCS Colloquium

Friday, October 16, 2020 - 2:00pm

David Duvenaud

University of Toronto

Location

University of Pennsylvania

via Zoom

The Zoom link is: https://upenn.zoom.us/j/91692440723

Abstract: Much real-world data is sampled at irregular intervals, but most time series models require regularly-sampled data. Continuous-time models address this problem, but until now only deterministic models (based on ordinary differential equations) or linear-Gaussian models were efficiently trainable with millions of parameters. We construct a scalable algorithm for computing gradients through samples from stochastic differential equations (SDEs), and for gradient-based stochastic variational inference in function space, all using adaptive black-box SDE solvers. This allows us to fit a new family of richly-parameterized distributions over time series, in which neural networks can parameterize both dynamics and likelihoods. We demonstrate these latent SDEs on motion capture data, and provide an open-source PyTorch library for fitting large SDE models.

 

The technical details are in this paper: https://arxiv.org/abs/2001.01328

And the code is available at: https://github.com/google-research/torchsde

 

Bio: David Duvenaud is an assistant professor of computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He completed his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting and trading company.