Compass Guest Lecture:
Dr Kamélia Daudel, Postdoctoral researcher
Department of Statistics, University of Oxford
Title: Variational Inference – Foundations and recent advances
Variational Inference methods are optimisation-based methods that have garnered considerable attention in Bayesian Statistics due to their applicability to high-dimensional Machine Learning problems. The purpose of this course is to give an introduction to Variational Inference methods and to shed light on some recent advances in the field.
In the first part of the course, we will explain the basics of Variational Inference methods and detail Mean-Field Variational Inference. This will enable us to see that Variational Inference methods are often hampered by two factors: (i) an inappropriate choice of the objective function appearing in the optimisation problem and (ii) a search space that is too restrictive to match the target at the end of the optimisation procedure. We will then outline how Variational Inference methods that use the alpha-divergence as a general objective function aim to tackle these two shortcomings.
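To make the objectives mentioned above concrete, one common parameterisation (conventions vary, and the lectures may use a different one) writes the standard Variational Inference objective as the reverse Kullback-Leibler divergence between the variational approximation q and the target p, with the alpha-divergence as its one-parameter generalisation:

```latex
\mathrm{KL}(q \,\|\, p) = \int q(\theta) \,\log \frac{q(\theta)}{p(\theta)} \,\mathrm{d}\theta,
\qquad
D_\alpha(q \,\|\, p) = \frac{1}{\alpha(\alpha - 1)}
\left( \int q(\theta)^{\alpha}\, p(\theta)^{1-\alpha} \,\mathrm{d}\theta \;-\; 1 \right).
```

Under this convention, D_alpha recovers KL(q || p) in the limit alpha -> 1 and KL(p || q) in the limit alpha -> 0, which is one sense in which the alpha-divergence offers a more flexible choice of objective function than the standard framework.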
In the second and third parts of the course, we will delve further into Variational Inference methods involving the alpha-divergence by presenting two ways to enlarge the search space beyond the traditional framework used in Variational Inference, namely Infinite-Dimensional Alpha-divergence Variational Inference (Part 2) and Monotonic Alpha-divergence Variational Inference (Part 3). In particular, we will show how these frameworks are constructed theoretically. This will allow us to uncover important connections with gradient-based schemes from the optimisation literature, as well as with an integrated EM algorithm from the importance sampling literature.
11:00-12:00: Lecture 1 (Room G.09, Fry Building)
12:00-13:00: Lunch break
13:00-14:00: Lecture 2 (Room 2.41, Fry Building)
14:30-15:30: Lecture 3 (Room 2.41, Fry Building)