Research Group "Stochastic Algorithms and Nonparametric Statistics"

Seminar "Modern Methods in Applied Stochastics and Nonparametric Statistics" Summer Semester 2021

13.04.21 Yangwen Sun (Humboldt-Universität zu Berlin)
Graph-spanning ratio test with application to the change-point detection problem
Please note: this talk starts at 4 pm due to the time shift!
20.04.21 Caroline Geiersbach (WIAS Berlin)
Stochastic approximation with applications to PDE-constrained optimization under uncertainty
27.04.21 Peter Friz (WIAS Berlin, TU Berlin)
New perspectives on rough paths, signatures and signature cumulants
We revisit rough paths and signatures from a geometric and "smooth model" perspective. This provides a lean framework to understand and formulate key concepts of the theory, including recent insights on higher-order translation, a.k.a. renormalization, of rough paths. This first part is joint work with C. Bellingeri (TU Berlin) and S. Paycha (U Potsdam). In a second part, we take a semimartingale perspective and, more specifically, analyze the structure of expected signatures when written in exponential form. Following Bonnier-Oberhauser (2020), we call the resulting objects signature cumulants. These can be described - and recursively computed - in a way that can be seen as a unification of previously unrelated pieces of mathematics, including Magnus (1954), Lyons-Ni (2015), Gatheral and coworkers (2017 onwards) and Lacoin-Rhodes-Vargas (2019). This is joint work with P. Hager and N. Tapia.
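As a hedged gloss for readers unfamiliar with the terminology (not part of the abstract): following Bonnier-Oberhauser (2020), the signature cumulant of a process X over [0,T] is the tensor-algebra logarithm of its expected signature,
\[
\boldsymbol{\kappa}_{0,T} = \log \mathbb{E}\big[\mathrm{Sig}(X)_{0,T}\big],
\qquad\text{equivalently}\qquad
\mathbb{E}\big[\mathrm{Sig}(X)_{0,T}\big] = \exp\big(\boldsymbol{\kappa}_{0,T}\big),
\]
in analogy with classical cumulants, which are the logarithm of the moment generating function.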
04.05.21 N. N.

11.05.21 Paul Hager (TU Berlin)
Optimal stopping with signatures
We propose a new method for solving optimal stopping problems (such as American option pricing in finance) under minimal assumptions on the underlying stochastic process. We consider classical and randomized stopping times represented by linear and non-linear functionals of the rough path signature associated with the underlying process, and prove that maximizing over these classes of signature stopping times in fact solves the original optimal stopping problem. Using the algebraic properties of the signature, we can then recast the problem as a (deterministic) optimization problem depending only on the (truncated) expected signature. By applying a deep neural network approach to approximate the non-linear signature functionals, we can efficiently solve the optimal stopping problem numerically. The only assumption on the underlying process is that it is a continuous (geometric) random rough path. Hence, the theory encompasses processes such as fractional Brownian motion, which fail to be either semimartingales or Markov processes, and can be used, in particular, for American-type option pricing in fractional models, e.g. on financial or electricity markets. This is joint work with Christian Bayer, Sebastian Riedel and John Schoenmakers.
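The reduction described in the abstract depends only on the truncated expected signature of the (time-augmented) underlying path. The following minimal sketch shows how such an object can be estimated by Monte Carlo; it assumes the third-party iisignature package and uses plain Brownian motion purely for illustration (the talk's setting covers general rough paths such as fractional Brownian motion), and it is not the speakers' code.

```python
# Monte Carlo estimate of the truncated expected signature of the
# time-augmented path (t, W_t) of a standard Brownian motion.
# Assumes the `iisignature` package; sample sizes are arbitrary.
import numpy as np
import iisignature

rng = np.random.default_rng(0)
n_paths, n_steps, depth = 1000, 200, 3
T = 1.0
dt = T / n_steps

sig_sum = np.zeros(iisignature.siglength(2, depth))
for _ in range(n_paths):
    # Brownian increments and the time-augmented path (t, W_t)
    dW = rng.normal(scale=np.sqrt(dt), size=n_steps)
    W = np.concatenate([[0.0], np.cumsum(dW)])
    t = np.linspace(0.0, T, n_steps + 1)
    path = np.stack([t, W], axis=1)           # shape (n_steps + 1, 2)
    sig_sum += iisignature.sig(path, depth)   # signature levels 1..depth

expected_sig = sig_sum / n_paths
print(expected_sig[:5])
```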
18.05.21 N. N.

25.05.21 N. N.

01.06.21 N. N.

08.06.21 N. N.

15.06.21 Lennard Henze (HU Berlin)
Classification of quantum dot images using machine learning
We explore the possibility of classifying quantum dot images obtained from a simulation using a machine learning algorithm together with image augmentation techniques. Results on the similar MNIST data set suggest that classification using machine learning techniques is indeed possible. Furthermore, a theoretical result concerning the extrapolation properties of artificial neural networks will be presented.
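As a purely illustrative sketch of the kind of pipeline the abstract refers to (image classification with augmentation, shown here on MNIST and assuming TensorFlow/Keras; this is not the speaker's code and does not use the quantum dot simulation data):

```python
# Minimal MNIST classification sketch with simple image augmentation.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# Simple augmentation: small rotations and shifts
augment = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=10, width_shift_range=0.1, height_shift_range=0.1)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(augment.flow(x_train, y_train, batch_size=128),
          epochs=3, validation_data=(x_test, y_test))
```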
22.06.21 N. N.

29.06.21 N. N.

06.07.21 N. N.

13.07.21 N. N.

20.07.21 N. N.

27.07.21 N. N.

03.08.21 N. N.

10.08.21 Darina Dvinskikh (WIAS Berlin)
Decentralized algorithms for Wasserstein barycenters
We consider the Wasserstein barycenter problem for discrete probability measures from both the computational and the statistical sides, in two scenarios: (i) the measures are given and we need to compute their Wasserstein barycenter, and (ii) the measures are generated from a probability distribution and we need to calculate the population barycenter of the distribution, defined by the notion of Fréchet mean. The statistical focus is estimating the sample size of measures necessary to approximate the Fréchet mean (barycenter) of a probability distribution with a given precision. For empirical risk minimization approaches, the question of regularization is also studied, and a new regularization is proposed that leads to better complexity bounds than quadratic regularization. The computational focus is on developing algorithms for calculating Wasserstein barycenters: both primal and dual algorithms that can be executed in a decentralized manner.
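For reference (standard definitions consistent with the abstract, stated here for the 2-Wasserstein distance as an assumption): the barycenter of given measures \(\mu_1,\dots,\mu_m\) with weights \(w_i \ge 0\), \(\sum_i w_i = 1\), and the population barycenter (Fréchet mean) of a distribution \(\mathbb{P}\) over measures are
\[
\hat\mu \in \operatorname*{arg\,min}_{\mu} \sum_{i=1}^m w_i\, W_2^2(\mu,\mu_i),
\qquad
\mu^\ast \in \operatorname*{arg\,min}_{\mu} \mathbb{E}_{\nu\sim\mathbb{P}}\big[W_2^2(\mu,\nu)\big].
\]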
17.08.21 Dr. Jia-Jie Zhu (WIAS Berlin/MPI-IS Tübingen)
Kernel methods for distributionally robust optimization and machine learning
Learning under distribution shift brings new challenges that cannot be addressed by canonical statistical learning theory. In this talk, I will present the use of reproducing kernel Hilbert spaces (RKHS) as dual spaces for enforcing distributional robustness in stochastic optimization and machine learning. I will then prove a generalized strong duality for distributionally robust optimization (DRO) using integral probability metrics (IPMs), of which the type-1 Wasserstein metric is a special instance. Our analysis highlights the roles that smooth function majorants, such as RKHS functions and the Moreau envelope, play in enforcing distributional robustness in optimization. Finally, I will introduce a new class of kernel semi-infinite programs that originate from the DRO reformulation.
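For reference (a standard formulation added as a hedged reminder, not the speaker's exact setting): an integral probability metric over a function class \(\mathcal{F}\) is
\[
d_{\mathcal{F}}(P,Q) = \sup_{f\in\mathcal{F}} \big( \mathbb{E}_{P}[f] - \mathbb{E}_{Q}[f] \big),
\]
where taking \(\mathcal{F}\) to be the 1-Lipschitz ball gives the type-1 Wasserstein metric and the RKHS unit ball gives the maximum mean discrepancy; the associated DRO problem then takes the form
\[
\min_{\theta}\ \sup_{Q:\, d_{\mathcal{F}}(Q,\widehat{P})\le \epsilon} \mathbb{E}_{Q}\big[\ell(\theta,\xi)\big].
\]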


last reviewed: August 11, 2021 by Christine Schneider