New Directions for Stochastic Differential Equations and Machine Learning


03 - 07 June 2024
09:00 BST

ICMS, Bayes Centre, Edinburgh

Scientific Organisers

  • Neill Campbell, University of Bath
  • James Foster (Lead organiser), University of Bath
  • Tony Shardlow, University of Bath
  • Kartic Subr, University of Edinburgh
  • Yue Wu, University of Strathclyde

About:

In recent years, the field of machine learning (ML) has seen tremendous progress, with many breakthroughs directly connected to the well-studied mathematical theory of Stochastic Differential Equations (SDEs). This increasingly fruitful relationship between SDEs and ML has produced several state-of-the-art innovations, ranging from Langevin algorithms in Bayesian learning to score-based diffusion models in computer vision.
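
To make the first of these connections concrete: the unadjusted Langevin algorithm is the Euler–Maruyama discretisation of the overdamped Langevin SDE dX_t = ∇ log π(X_t) dt + √2 dW_t, whose stationary distribution is the target density π. The JAX sketch below is purely illustrative; the Gaussian target, step size, and chain length are our assumptions, not workshop material.

```python
import jax
import jax.numpy as jnp

def ula_step(x, key, grad_log_pi, step=1e-2):
    # One unadjusted Langevin (Euler-Maruyama) step:
    # x' = x + step * grad(log pi)(x) + sqrt(2 * step) * N(0, I)
    noise = jax.random.normal(key, x.shape)
    return x + step * grad_log_pi(x) + jnp.sqrt(2.0 * step) * noise

# Illustrative target: a standard Gaussian, so grad(log pi)(x) = -x.
grad_log_pi = jax.grad(lambda x: -0.5 * jnp.sum(x ** 2))

def run_chain(key, n_steps=5000, dim=2):
    keys = jax.random.split(key, n_steps)
    def body(x, k):
        x_next = ula_step(x, k, grad_log_pi)
        return x_next, x_next
    _, xs = jax.lax.scan(body, jnp.zeros(dim), keys)
    return xs  # after burn-in, approximately distributed as pi

samples = run_chain(jax.random.PRNGKey(0))
```

After many steps the iterates approximate samples from π up to a discretisation bias; removing that bias is precisely what Metropolis-adjusted variants, discussed later in the programme, are designed to do.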

This workshop aimed to bring the SDE and ML communities closer together and “sow the seeds” for future interdisciplinary and impactful research. The following general themes were explored:

  • SDE-inspired learning algorithms and architectures
  • Computational or learning-based algorithms for SDEs
  • Theoretical connections between SDEs and machine learning
  • Applications and areas of opportunity between disciplines


Programme

Monday 3 June 2024
  • Registration and Refreshments
  • Welcome and Housekeeping
  • Mini-course 1: Desmond Higham (University of Edinburgh), Introduction to the Numerical Simulation of SDEs
  • Lunch
  • Mini-course 2: Fabio De Sousa Ribeiro (Imperial College London), Demystifying Diffusion Models
  • Break and Discussion
  • Mini-course 3: Andraž Jelinčič (University of Bath), Using Diffrax for efficient GPU-accelerated SDE simulation (a minimal code sketch follows the programme below)
Tuesday 4 June 2024
  • Konstantinos Zygalakis (University of Edinburgh), Talk Title TBC
  • Break and Discussion
  • Tiffany Vlaar (University of Glasgow), Constrained and Partitioned Training of Neural Networks
  • Break and Discussion
  • Robert Gruhlke (FU Berlin), Generative modelling with Tensor Train approximations of Hamilton–Jacobi–Bellman equations
  • Lunch
  • Alexander Lobbe (Imperial College London), Generative Modelling of Stochastic Parametrisations for Geophysical Fluid Dynamics
  • Break and Discussion
  • Yating Liu (Université Paris-Dauphine), Application of optimal quantization and K-means clustering to the simulation of the McKean–Vlasov equation
  • Break and Discussion
  • Teo Deveney (University of Bath), Closing the ODE-SDE gap in score-based diffusion models through the Fokker-Planck equation
  • Break and Discussion
  • Terry Lyons (University of Oxford), Talk Title TBC
  • Break and Discussion
  • Welcome Reception & Poster Session, hosted at ICMS
  • Public Lecture (hosted in G.03, ground floor): Terry Lyons (University of Oxford), Signatures of Streams
Wednesday 5 June 2024
  • Desmond Higham (University of Edinburgh), Stability Issues for Diffusion Models in Generative AI
  • Break and Discussion
  • Georgios Batzolis (University of Cambridge), Variational Diffusion Auto-encoder: Latent Space Extraction from Pre-trained Diffusion Models
  • Break and Discussion
  • Thomas Gaskin (University of Cambridge), Neural parameter calibration for large-scale systems
  • Lunch
  • Mini-course 4: Grigoris Pavliotis (Imperial College London), Langevin-based sampling schemes
  • Free afternoon (guided walk around the city)
Thursday 6 June 2024
  • Neil Chada (Heriot-Watt University), Unbiased Kinetic Langevin Monte Carlo
  • Break and Discussion
  • Benedict Leimkuhler (University of Edinburgh), Langevin and Adaptive Langevin Algorithms for Sampling and Optimisation in Machine Learning
  • Break and Discussion
  • Lionel Riou-Durand (National Institute of Applied Sciences of Rouen), Metropolis Adjusted Langevin Trajectories: a robust alternative to Hamiltonian Monte Carlo
  • Lunch
  • Josh Williams (STFC Hartree Centre), Modelling particle-laden turbulent flows with neural stochastic differential equations
  • Break and Discussion
  • Irene Tubikanec (Johannes Kepler University Linz), Network inference in a stochastic multi-population neural mass model via approximate Bayesian computation
  • Break and Discussion
  • Hao Ni (University College London), High Rank Path Development: an approach to learning the filtration of stochastic processes
  • Break and Discussion
  • Workshop Dinner, hosted at Blonde Restaurant, 75 St. Leonard’s Street, Edinburgh EH8 7QR
Friday 7 June 2024
  • Teresa Klatzer (University of Edinburgh), Bayesian Computation with Plug and Play Priors for Poisson Inverse Problems
  • Break and Discussion
  • Mateusz Majka (Heriot-Watt University), Sampling, optimization, SDEs and gradient flows
  • Break and Discussion
  • Grigoris Pavliotis (Imperial College London), Learning mean field models from data
  • Lunch and End of Workshop
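
As a small taster for Mini-course 3, the sketch below simulates an Ornstein–Uhlenbeck SDE with Diffrax, a JAX library for differential equations. It is a minimal illustration under assumed parameters: the drift and diffusion coefficients, tolerance, and solver choice are ours, not taken from the course materials.

```python
import jax.numpy as jnp
import jax.random as jr
import diffrax

# Ornstein-Uhlenbeck process: dX_t = -theta * X_t dt + sigma dW_t.
# theta and sigma are illustrative values.
theta, sigma = 1.0, 0.5

drift = diffrax.ODETerm(lambda t, y, args: -theta * y)
# Brownian motion sampled on demand, so the solver can run on GPU
# without storing a whole path up front.
brownian = diffrax.VirtualBrownianTree(t0=0.0, t1=1.0, tol=1e-3, shape=(), key=jr.PRNGKey(0))
diffusion = diffrax.ControlTerm(lambda t, y, args: sigma, brownian)

sol = diffrax.diffeqsolve(
    diffrax.MultiTerm(drift, diffusion),
    diffrax.Euler(),  # Euler applied to the drift + diffusion terms gives Euler-Maruyama
    t0=0.0, t1=1.0, dt0=0.01,
    y0=1.0,
    saveat=diffrax.SaveAt(ts=jnp.linspace(0.0, 1.0, 11)),
)
print(sol.ts, sol.ys)  # solution values at the requested save times
```

Because the whole solve is an ordinary JAX computation, it can be wrapped in jax.vmap to simulate many independent paths in parallel on a GPU, which is the kind of efficiency the mini-course title refers to.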

Sponsors and Funders:

  • Mathematics