Lecture on Sequential Monte Carlo Samplers by Nicolas Chopin, ENSAE and Institut Polytechnique de Paris

Compass Seminars

The Compass Seminars will be starting next week with an exciting talk by Yuege Xie, a PhD student at UT Austin. You might wonder why we need another seminar series and how this will differ from the Data Science or Statistics Seminars. You can find out more below.

Short-Term Goals

To start off, the primary audience will be the two cohorts of PhD students in the Compass CDT, although anyone is welcome and, in fact, encouraged to attend. The aim is to take advantage of the current situation and attract speakers whose work is closely related to the students’ research areas. Each speaker will be encouraged not only to provide thorough background for their talk but also to structure it as a tutorial where possible. This practical, workshop-like approach differs from the other two seminar series and is designed to keep PhD students engaged and allow them to explore different research areas in a more accessible way.

For the last month of TB1, one of the main goals will be to get the series up and running by establishing a line-up of speakers suggested by the first cohort of Compass students, so keep an eye on our calendar!

Long-Term Vision

As the series gains momentum, our vision is for this seminar series to become a platform for PhD students in Statistics, Data Science and Machine Learning across the UK: a place to network with fellow researchers, get exposed to different research areas in an accessible way, and take part in collaborative tasks and challenges such as hackathons and Kaggle-style competitions. By building a strong national community of young data scientists, we will be able to attract leading industry speakers and professors from around the world.

At the moment, work is underway to expand the audience of this seminar series and we will surely keep you updated! Come join us next Thursday and get a Deliveroo voucher after completing a feedback survey at the end!


JGI event: Data Science Seminars

The Jean Golding Institute runs an annual series of Data Science Seminars

Upcoming seminars (if you are interested in attending you can sign up with Eventbrite using the links below):

DeepMind UK scientist to tutor Compass students

Taylan Cemgil, Research Scientist at DeepMind UK, will speak at the Jean Golding Institute Data Science Seminar Series and give an exclusive talk on Representation Learning specifically for Compass students.

Harvard Prof delivers guest lectures to Compass students

We are delighted to welcome Pierre Jacob, Associate Professor of Statistics at Harvard University, to the University of Bristol in March.

Pierre will be delivering lectures for the COMPASS CDT students but all staff in the School of Maths are welcome to attend.

Title: Couplings and Monte Carlo

In his lectures, Pierre will cover couplings, total variation and optimal transport. He will describe the use of couplings in Monte Carlo methods, such as coupling from the past, diagnostics of convergence, and bias removal.
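The link between couplings and total variation is what makes them useful for convergence diagnostics and bias removal: under a maximal coupling, two chains meet with probability 1 − TV between their distributions. As a toy illustration (our own sketch, not taken from the lectures; the function names and the Gaussian example are ours), here is the standard maximal-coupling construction in Python:

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def maximal_coupling(rng, mu_p, mu_q, sigma=1.0):
    """Draw (X, Y) with X ~ N(mu_p, sigma^2) and Y ~ N(mu_q, sigma^2),
    maximising P(X = Y) (Thorisson's algorithm)."""
    x = rng.normal(mu_p, sigma)
    if rng.uniform() * norm_pdf(x, mu_p, sigma) <= norm_pdf(x, mu_q, sigma):
        return x, x                       # the two draws coincide
    while True:                           # otherwise sample Y from the residual of q
        y = rng.normal(mu_q, sigma)
        if rng.uniform() * norm_pdf(y, mu_q, sigma) > norm_pdf(y, mu_p, sigma):
            return x, y

rng = np.random.default_rng(0)
pairs = np.array([maximal_coupling(rng, 0.0, 1.0) for _ in range(20000)])
xs, ys = pairs[:, 0], pairs[:, 1]
meet = np.mean(xs == ys)   # meeting frequency estimates 1 - TV(p, q)
```

Both marginals are exact, and the meeting probability equals one minus the total variation distance between the two Gaussians; coupled chains that meet and then stay together are the basic ingredient of the debiasing schemes Pierre will discuss.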

This event is sponsored by COMPASS – EPSRC Centre for Doctoral Training in Computational Statistics and Data Science.

Member of French Academy of Sciences presents mini-series of lectures

Eric Moulines from Ecole Polytechnique is visiting the University of Bristol and the School of Mathematics in January 2020, where he will present a mini-series of lectures.

Convex optimization for machine learning

The purpose of this course is to give an introduction to convex optimization and its applications in statistical learning.

In the first part of the course, I will recall the importance of convex optimization in statistical learning and briefly introduce some useful results from convex analysis. I will then analyse gradient descent algorithms for strongly convex and then convex smooth functions, taking this opportunity to establish some complexity lower bounds for such problems. I will show that the plain gradient descent algorithm is suboptimal, in that it does not achieve the best possible rate of convergence, and I will then present a strategy for accelerating gradient descent in order to attain optimal rates.
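The gap between the two rates is easy to see numerically. The following NumPy experiment (our own illustration, not course material) compares plain gradient descent with a Nesterov-style accelerated scheme on an ill-conditioned strongly convex quadratic, where the error contracts roughly like (1 − 1/κ)ᵏ versus (1 − 1/√κ)ᵏ for condition number κ:

```python
import numpy as np

rng = np.random.default_rng(1)
# ill-conditioned strongly convex quadratic: f(x) = 0.5 x'Ax - b'x
d = 50
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
eigs = np.linspace(1.0, 100.0, d)            # condition number kappa = 100
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(d)
x_star = np.linalg.solve(A, b)
L, mu = eigs.max(), eigs.min()

def gd(steps):
    x = np.zeros(d)
    for _ in range(steps):
        x = x - (1.0 / L) * (A @ x - b)      # gradient step, step size 1/L
    return np.linalg.norm(x - x_star)

def nesterov(steps):
    x = y = np.zeros(d)
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    for _ in range(steps):
        x_new = y - (1.0 / L) * (A @ y - b)  # gradient step at extrapolated point
        y = x_new + beta * (x_new - x)       # momentum extrapolation
        x = x_new
    return np.linalg.norm(x - x_star)

err_gd, err_acc = gd(300), nesterov(300)     # accelerated error is far smaller
```

After 300 iterations the accelerated scheme is several orders of magnitude closer to the minimiser, consistent with the κ-versus-√κ dependence of the two convergence rates.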

In the second part of the course, I will focus on non-smooth optimization problems. I will introduce the proximal operator and establish some of its essential properties, and then study proximal gradient algorithms and their accelerated versions.
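For the ℓ₁ penalty, the proximal operator has the closed form of soft-thresholding, which makes the proximal gradient method (ISTA) very simple. A minimal sketch on a lasso problem (again our own illustration, with made-up data, not course material):

```python
import numpy as np

rng = np.random.default_rng(2)
# lasso: minimise 0.5 * ||Ax - b||^2 + lam * ||x||_1
n, d = 100, 20
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [3.0, -2.0, 1.5]                 # sparse ground truth
b = A @ x_true + 0.1 * rng.standard_normal(n)
lam = 1.0
t = 1.0 / np.linalg.norm(A, 2) ** 2           # step size 1/L, L = ||A||_2^2

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(d)
for _ in range(500):
    grad = A.T @ (A @ x - b)                  # gradient of the smooth part
    x = soft_threshold(x - t * grad, t * lam) # proximal gradient step
```

The iterates recover the three large coefficients while keeping the remaining entries essentially at zero; the accelerated version (FISTA) only adds a momentum extrapolation between the proximal steps.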

In the third part, I will look at stochastic versions of these algorithms.
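In the stochastic setting, the full gradient is replaced by an unbiased single-sample estimate with decreasing step sizes. A minimal SGD sketch on a least-squares problem (our own illustration, not course material):

```python
import numpy as np

rng = np.random.default_rng(3)
# least squares: f(x) = (1/2n) * ||Ax - b||^2 over n data points
n, d = 1000, 5
A = rng.standard_normal((n, d))
x_true = np.arange(1.0, d + 1)
b = A @ x_true + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
for k in range(20000):
    i = rng.integers(n)                  # pick one data point uniformly
    g = (A[i] @ x - b[i]) * A[i]         # unbiased estimate of the gradient
    x -= 0.1 / (1.0 + k / 1000) * g      # Robbins-Monro decreasing step size
```

Each step costs O(d) instead of O(nd), and with these step sizes the iterates converge to a small neighbourhood of the true coefficients.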

The lectures will take place at the following times:

Tuesday 28th January, 11:00–12:00
Thursday 30th January, 13:00–14:00
Friday 31st January, 10:00–11:00

Tokyo research scientist gives series of data science lectures

Pierre Alquier (Research Scientist, RIKEN AIP, Tokyo) will visit the University of Bristol School of Mathematics from 25 November to 6 December 2019.

As a visitor to the Heilbronn Institute, he will give a series of data science lectures to Compass students on 27 November 2019:

  • Introduction to the variational approach and examples: mixture models, matrix completion and recommendation systems, deep learning
  • Theoretical analysis of variational methods

He will also present additional lectures during his visit on areas such as:

  • A Generalization Bound for Online Variational Inference

Mathieu Gerber, Compass Training Co-ordinator, commented: “In his lectures Pierre has provided and proved one of the first general results on the validity of variational methods, which are popular tools for approximating high-dimensional posterior distributions.”
