European Summer School in Financial Mathematics 14th edition

30 Aug - 03 Sep 2021

Online

The 14th edition of the European Summer School in Financial Mathematics was hosted by the International Centre for Mathematical Sciences (ICMS).

About:

The Summer School brought together talented young researchers in mathematical finance.

The summer school focused on two advanced courses:
1) Optimal transport methods for economic models and machine learning
2) The signature method in machine learning and its applications to mathematical finance

Student seminars and discussion sessions allowed the participants to engage with one another and to discuss their current research.
One of the aims of the Summer School was to encourage active cooperation and collaboration in mathematical finance among European institutions. We sincerely thank the members of the Scientific Committee for their support in achieving this aim.

This school belonged to the series of European Mathematical Society applied mathematics schools. We gratefully acknowledge the support of the International Centre for Mathematical Sciences (ICMS), CMAP, École Polytechnique (Paris, France), the Adam Smith Business School (University of Glasgow), the Glasgow Mathematical Journal Learning and Research Support Fund, and the ANR program Investissements d'Avenir.

At the time of this workshop the Organising Committee consisted of:

Ankush Agarwal
Gonçalo Dos Reis
Stefano De Marco
Thibaut Mastrolia

The Scientific Committee

The Scientific Committee consists of European leaders and representatives of financial mathematics. We warmly thank them for their encouragement and for agreeing to serve on this committee.

Peter Bank, Peter Imkeller, Wolfgang Runggaldier, Mete Soner, Youri Kabanov, Walter Schachermayer, Josef Teichmann, Santiago Carrillo, Ralf Korn, Martin Schweizer, Albert Shiryaev, Nicole El Karoui, Gilles Pagès, Huyen Pham, Marco Frittelli, Damien Lamberton, Bernard Lapeyre, Lukas Stettner, David Hobson, Bernt Øksendal, Denis Talay, Chris Rogers

Workshop statistics - 197 registered participants

Average daily attendance - 109 participants

Programme:

Optimal Transport Methods in Machine Learning: from the Sinkhorn algorithm to Generative Adversarial Networks
by Beatrice Acciaio (ETH Zurich, Switzerland)
We start by recalling tools from classical optimal transport (OT) theory, and then introduce recent developments in OT, in particular what is now called causal optimal transport (COT). We illustrate how the concept of causality in OT is the right one for tackling dynamic problems, where time plays a crucial role, especially in a financial context. We then consider regularized optimal transport problems and the Sinkhorn algorithm used to compute entropic OT. Further, we review recent developments in generative adversarial networks (GANs) that employ tools from OT theory. We then combine all of the above concepts to train a network to generate or predict (financial) time series. Finally, we discuss the results and the numerical challenges.
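
To make the entropic OT step concrete, here is a minimal numpy sketch of the Sinkhorn iterations, not taken from the course: the point clouds, cost matrix, regularisation strength and iteration count are all illustrative choices.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=500):
    """Entropic OT between discrete distributions mu, nu with cost matrix C.

    Alternately rescales the rows and columns of the Gibbs kernel
    K = exp(-C / eps) until the coupling's marginals match mu and nu.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)               # match the second marginal
        u = mu / (K @ v)                 # match the first marginal
    P = u[:, None] * K * v[None, :]      # approximate optimal coupling
    return P, np.sum(P * C)              # coupling and entropic transport cost

# Toy example: transport between two uniform point clouds on the line.
x = np.linspace(0.0, 1.0, 50)
y = np.linspace(0.2, 1.2, 60)
mu = np.full(50, 1 / 50)
nu = np.full(60, 1 / 60)
C = (x[:, None] - y[None, :]) ** 2       # squared-distance cost
P, cost = sinkhorn(mu, nu, C)
print("entropic OT cost:", cost)
```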

Optimal Transport Methods for Economic Models
by Alfred Galichon (New York University, USA)

This course focuses on the computation of competitive equilibrium, which is at the core of surge pricing engines and allocation mechanisms. It will investigate diverse applications such as network congestion, surge pricing, and matching platforms. It will provide a bridge between theory, empirics and computation, and introduce tools from economics, mathematics and computer science. Mathematical concepts (such as lattice programming, supermodularity, discrete convexity, Galois connections, etc.) will be taught while studying various economic models. The same is true of computational methods (such as 'tâtonnement' algorithms, asynchronous parallel computation, mathematical programming under equilibrium constraints, etc.).
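
As a hedged illustration of the tâtonnement idea mentioned above, the following sketch runs the classical price-adjustment iteration on a toy Cobb-Douglas exchange economy; the economy, step size and iteration count are invented for illustration and are not the course's implementation.

```python
import numpy as np

# Toy Cobb-Douglas exchange economy: 2 agents, 2 goods (illustrative numbers).
alpha = np.array([[0.3, 0.7],     # agents' expenditure shares
                  [0.6, 0.4]])
endow = np.array([[1.0, 0.0],     # agents' endowments
                  [0.0, 1.0]])

def excess_demand(p):
    wealth = endow @ p                         # value of each agent's endowment
    demand = alpha * (wealth[:, None] / p)     # Cobb-Douglas demand functions
    return demand.sum(axis=0) - endow.sum(axis=0)

# Tatonnement: raise prices of goods in excess demand, lower the others.
p = np.array([0.5, 0.5])
for _ in range(1000):
    p = p + 0.1 * excess_demand(p)
    p = np.maximum(p, 1e-8)                    # keep prices positive
    p = p / p.sum()                            # normalise (prices are relative)

print("equilibrium prices:", p)
print("excess demand:", excess_demand(p))      # should be close to zero
```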

A Primer on the Signature Method in Machine Learning
by Ilya Chevyrev (University of Edinburgh, UK)

The signature of a path has been recognised in the last few years as a powerful method to store information about a path. At its basic level, the signature is the collection of iterated integrals of a path. This simple definition leads to surprisingly deep properties, which all indicate that the signature is a natural analogue of polynomials on paths. In this minicourse, I will present the definition of the signature and how it arises in several contexts, including control theory and stochastic differential equations. I will demonstrate some of its important properties: these include the shuffle identity, which is responsible for the polynomial-like behaviour on paths, and the Chen identity, which is important for computations. In the last part of the course, I will discuss some recent applications to machine learning, focusing on kernel learning and classification tasks.
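
To give a feel for the iterated-integral definition, here is a toy numpy sketch (not course code) that computes the first two signature levels of a piecewise-linear path via the Chen identity and verifies the shuffle identity numerically.

```python
import numpy as np

def signature_level2(path):
    """Levels 1 and 2 of the signature of a piecewise-linear path.

    path: array of shape (n_points, d). Level 1 is the total increment;
    level 2 collects the iterated integrals S^(i,j), accumulated segment
    by segment with Chen's identity (a linear segment with increment D
    contributes outer(D, D) / 2 at level 2).
    """
    d = path.shape[1]
    s1 = np.zeros(d)
    s2 = np.zeros((d, d))
    for delta in np.diff(path, axis=0):
        s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)  # Chen
        s1 += delta
    return s1, s2

# Toy 2D path: one loop around the unit circle.
t = np.linspace(0.0, 1.0, 100)
path = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
s1, s2 = signature_level2(path)

# Shuffle identity at level 2: S^(i) S^(j) = S^(i,j) + S^(j,i).
assert np.allclose(np.outer(s1, s1), s2 + s2.T)
print("level 1 (total increment):", s1)
print("Levy area:", 0.5 * (s2[0, 1] - s2[1, 0]))
```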

Harnessing quantitative finance by deep learning
by Blanka Horvath (King's College London, UK) and Mikko Pakkanen (Imperial College London, UK)

Deep learning is currently making headway in the realm of quantitative finance, whilst the financial industry is increasingly embracing data-driven workflows powered by machine learning and data science. In this minicourse, we shall present some of the recent advances of deep learning applied to quantitative finance. After a brief introduction to the basic principles of deep learning, we will explain how it can be applied to derivatives pricing, hedging and market data simulation in a novel way. We will demonstrate these methods by extensive numerical examples.
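
By way of illustration of deep learning applied to derivatives pricing, here is a minimal PyTorch sketch that fits a small network to synthetic Black-Scholes call prices; the architecture, parameter ranges and training budget are arbitrary choices for this sketch, not the course's setup.

```python
import torch

torch.manual_seed(0)

def bs_call(m, tau, sigma):
    """Black-Scholes call price with spot S0 = 1, strike m, zero rates."""
    N = torch.distributions.Normal(0.0, 1.0)
    d1 = (-torch.log(m) + 0.5 * sigma**2 * tau) / (sigma * torch.sqrt(tau))
    d2 = d1 - sigma * torch.sqrt(tau)
    return N.cdf(d1) - m * N.cdf(d2)

# Synthetic training data on a box of model parameters.
n = 50_000
m = 0.5 + torch.rand(n)                 # strike / spot in [0.5, 1.5]
tau = 0.1 + 1.9 * torch.rand(n)         # maturity in [0.1, 2.0] years
sigma = 0.05 + 0.45 * torch.rand(n)     # volatility in [0.05, 0.5]
X = torch.stack([m, tau, sigma], dim=1)
y = bs_call(m, tau, sigma).unsqueeze(1)

net = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    idx = torch.randint(0, n, (1024,))           # minibatch of parameters
    loss = torch.mean((net(X[idx]) - y[idx]) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final batch MSE: {loss.item():.2e}")
```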

Differential Machine Learning
by Antoine Savine (Danske Bank and Copenhagen University, Denmark)

Differential machine learning (ML) extends supervised learning, with models trained on examples of not only inputs and labels, but also the differentials of labels with respect to inputs. Differential ML is applicable in all situations where high-quality first-order derivatives with respect to the training inputs are available. In the context of financial derivatives risk management, pathwise differentials are efficiently computed with automatic adjoint differentiation (AAD). Differential ML, combined with AAD, provides extremely effective pricing and risk approximations. We can produce fast pricing analytics in models too complex for closed-form solutions, extract the risk factors of complex transactions and trading books, and effectively compute risk management metrics such as risk reports across a large number of scenarios, backtesting and simulation of hedge strategies, or regulatory capital.

The course focuses on differential deep learning (DL), arguably the strongest application. We will show how standard DL trains neural networks (NNs) on pointwise examples, whereas differential DL also teaches them the shape of the target function, resulting in vastly improved performance, and we will illustrate this with a number of numerical examples, both idealized and real-world. We will also discuss how to apply differential learning to other ML models, such as classic regression or principal component analysis (PCA).
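
A minimal PyTorch sketch of the differential training objective, on a toy one-dimensional target where exact derivatives are available; in practice the differentials would come from AAD pathwise sensitivities, and the sin target, derivative weight lam and architecture here are purely illustrative.

```python
import torch

torch.manual_seed(0)
x = torch.linspace(-3, 3, 256).unsqueeze(1)
y = torch.sin(x) + 0.01 * torch.randn_like(x)     # labels
dy = torch.cos(x) + 0.01 * torch.randn_like(x)    # differentials of labels

net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam = 1.0                                         # weight on the derivative term

for step in range(2000):
    x_in = x.clone().requires_grad_(True)
    pred = net(x_in)
    # Predicted derivative d(pred)/dx, kept in the graph so it can be trained.
    dpred, = torch.autograd.grad(pred.sum(), x_in, create_graph=True)
    loss = (torch.mean((pred - y) ** 2)           # fit the values...
            + lam * torch.mean((dpred - dy) ** 2))  # ...and the shape
    opt.zero_grad(); loss.backward(); opt.step()

print(f"value MSE: {torch.mean((net(x) - y) ** 2).item():.2e}")
```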


Neural Stochastic Differential Equations for Time Series Modelling
by James Foster (Oxford University, UK)

Stochastic differential equations (SDEs) are a popular model for describing continuous-time phenomena and have seen particular success in the pricing and hedging of financial derivatives. However, given the current data science revolution and following the seminal paper “Neural Ordinary Differential Equations”, it is natural to investigate how SDE methodologies could be improved using tools from machine learning. This has led to several recent works on so-called “Neural SDEs”, which seek to combine the modelling capabilities of SDEs with the flexibility and efficient training of neural networks. In this talk, I will give an overview of these developments and show how SDEs can be viewed as time series models that fit nicely with well-known ideas from data science, such as generative adversarial networks (GANs).
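
To fix ideas, here is a hedged PyTorch sketch of a neural SDE used as a path generator via an Euler-Maruyama scheme; the architecture and step sizes are illustrative, and the training objective (e.g. a GAN discriminator matching generated paths to data) is deliberately left out.

```python
import torch

class NeuralSDE(torch.nn.Module):
    """dX_t = mu(t, X_t) dt + sigma(t, X_t) dW_t with mu, sigma as small MLPs."""
    def __init__(self, hidden=32):
        super().__init__()
        self.drift = torch.nn.Sequential(
            torch.nn.Linear(2, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1))
        self.diffusion = torch.nn.Sequential(
            torch.nn.Linear(2, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1), torch.nn.Softplus())  # keep sigma > 0

    def forward(self, x0, n_steps=100, dt=0.01):
        """Euler-Maruyama: X_{t+dt} = X_t + mu dt + sigma sqrt(dt) Z."""
        x, path = x0, [x0]
        for k in range(n_steps):
            t = torch.full_like(x, k * dt)
            inp = torch.cat([t, x], dim=1)        # networks see (t, X_t)
            z = torch.randn_like(x)
            x = x + self.drift(inp) * dt + self.diffusion(inp) * (dt ** 0.5) * z
            path.append(x)
        return torch.stack(path, dim=1)           # shape (batch, n_steps + 1, 1)

sde = NeuralSDE()
paths = sde(torch.zeros(64, 1))                   # 64 sample paths from x0 = 0
print(paths.shape)                                # torch.Size([64, 101, 1])
```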

Monday
Ilya Chevyrev, University of Edinburgh: A Primer on the Signature Method in Machine Learning
Break
Student Talks
Lunch
Ilya Chevyrev, University of Edinburgh: A Primer on the Signature Method in Machine Learning
Break
Blanka Horvath, King's College London: Harnessing quantitative finance by deep learning
Student Talks

Tuesday
Ilya Chevyrev, University of Edinburgh: A Primer on the Signature Method in Machine Learning
Break
Antoine Savine, Danske Bank and Copenhagen University: Differential Machine Learning
Lunch
Antoine Savine, Danske Bank and Copenhagen University: Differential Machine Learning
Mikko Pakkanen, Imperial College London: Harnessing quantitative finance by deep learning
Student Talks

Wednesday
Ilya Chevyrev, University of Edinburgh: A Primer on the Signature Method in Machine Learning
James Foster, Oxford University: Neural Stochastic Differential Equations for Time Series Modelling
Break
Student Talks
Lunch
Alfred Galichon, New York University: Optimal Transport Methods for Economic Models
Break
Alfred Galichon, New York University: Optimal Transport Methods for Economic Models

Thursday
Beatrice Acciaio, ETH Zurich: Optimal Transport Methods in Machine Learning: from the Sinkhorn algorithm to Generative Adversarial Networks
Break
Pietro Siorpaes
Student Talks
Lunch
Beatrice Acciaio, ETH Zurich: Optimal Transport Methods in Machine Learning: from the Sinkhorn algorithm to Generative Adversarial Networks
Break
Alfred Galichon, New York University: Optimal Transport Methods for Economic Models

Friday
Beatrice Acciaio, ETH Zurich: Optimal Transport Methods in Machine Learning: from the Sinkhorn algorithm to Generative Adversarial Networks
Break
Beatrice Acciaio, ETH Zurich: Optimal Transport Methods in Machine Learning: from the Sinkhorn algorithm to Generative Adversarial Networks
Student Talks and Prize Talks
Closing Notes

Sponsors and Funders:

  • International Centre for Mathematical Sciences (ICMS)
  • CMAP, École Polytechnique (Paris, France)
  • Adam Smith Business School (University of Glasgow)
  • Glasgow Mathematical Journal Learning and Research Support Fund
  • European Mathematical Society
  • ANR program Investissements d'Avenir