Research Group "Stochastic Algorithms and Nonparametric Statistics"

Seminar "Modern Methods in Applied Stochastics and Nonparametric Statistics" Winter Semester 2020/2021

01.09.2020 Franziska Bielert (Humboldt-Universität zu Berlin)
Estimation of quadratic functionals in a non-parametric regression model and Gaussian white noise model
27.10.2020 Carsten Hartmann (BTU Cottbus-Senftenberg)
Convergence to equilibrium in underdamped Langevin dynamics: Fluctuation-dissipation relations, entropy production and optimal control
We study the convergence to equilibrium of an underdamped Langevin equation that is controlled by a linear feedback force. Specifically, we are interested in sampling the possibly multimodal invariant probability distribution of a Langevin system at small noise (or low temperature) that is characterised by metastability and slow convergence to equilibrium. We follow an approach proposed by Pavon and co-workers [1] and consider a Langevin equation that is simulated at a high temperature, with the control playing the role of a friction that balances the additional noise so as to restore the original invariant measure at a lower temperature. We discuss different limits as the temperature ratio goes to infinity and prove convergence to a limit dynamics. It turns out that, depending on whether the lower ("target") or the higher ("simulation") temperature is fixed, the controlled dynamics converges either to the overdamped Langevin equation or to a deterministic gradient flow. This implies (a) that the ergodic limit and the large temperature separation limit do not commute in general, and (b) that it is not possible to accelerate the speed of convergence to the ergodic limit by making the temperature separation larger and larger. We discuss the implications of these observations from the perspective of stochastic optimisation algorithms and enhanced sampling schemes in molecular dynamics. This is joint work with Tobias Breiten, Lara Neureither and Upanshu Sharma [2].
[1] Y. Chen, T.T. Georgiou and M. Pavon. Fast cooling for a system of stochastic oscillators. J. Math. Phys. 56:113302, 2015.
[2] T. Breiten, C. Hartmann, L. Neureither and U. Sharma. Stochastic gradient descent and fast relaxation to thermodynamic equilibrium: a stochastic control approach. Preprint, 2020.
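The fluctuation-dissipation mechanism described in the abstract can be illustrated by a minimal Euler-Maruyama sketch (all function names and parameter values here are illustrative, not taken from [2]): noise is injected at a high simulation temperature, and the linear friction is rescaled so that the invariant measure is restored at the lower target temperature.

```python
import numpy as np

def controlled_langevin(grad_V, q0, p0, gamma, T_sim, T_target, dt, n_steps, rng):
    """Euler-Maruyama for an underdamped Langevin equation where noise enters
    at the high simulation temperature T_sim, while the linear friction is
    rescaled (the 'control') so that the fluctuation-dissipation relation
    restores the invariant measure proportional to exp(-H/T_target)."""
    gamma_eff = gamma * T_sim / T_target  # friction balancing the extra noise
    q, p = q0, p0
    qs = np.empty(n_steps)
    for i in range(n_steps):
        q = q + p * dt
        p = (p - (grad_V(q) + gamma_eff * p) * dt
               + np.sqrt(2.0 * gamma * T_sim * dt) * rng.standard_normal())
        qs[i] = q
    return qs
```

For a quadratic potential V(q) = q^2/2, the empirical variance of q should approach T_target rather than T_sim, which is the point of the friction rescaling.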
03.11.2020 Darina Dvinskikh (WIAS Berlin)
Improved complexity bounds in Wasserstein barycenter problem
We focus on the computational aspects of the Wasserstein barycenter problem and provide two algorithms to compute the barycenter. The first algorithm, based on mirror prox with a specific norm, matches the complexity of the celebrated accelerated iterative Bregman projections (IBP) algorithm, but without the limitations of (accelerated) IBP, which is numerically unstable when the regularization parameter is small. The second algorithm, based on area-convexity and dual extrapolation, improves the previously best-known convergence rates for the Wasserstein barycenter problem.
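For context on the baseline being compared against, here is a minimal sketch of (non-accelerated) iterative Bregman projections for the entropically regularized barycenter of discrete measures on a common support; names and sizes are illustrative. The instability for small regularization is visible in the kernel K = exp(-C/eps), whose entries underflow as eps shrinks.

```python
import numpy as np

def ibp_barycenter(ps, C, eps, weights, n_iter=500):
    """Iterative Bregman projections for the entropic Wasserstein barycenter.

    ps: (N, n) array of N probability vectors on a common support of size n,
    C: (n, n) cost matrix, eps: entropic regularization parameter,
    weights: (N,) barycenter weights summing to one.
    """
    K = np.exp(-C / eps)          # small eps makes K severely ill-conditioned
    N, n = ps.shape
    v = np.ones((N, n))
    for _ in range(n_iter):
        u = ps / (K @ v.T).T      # project onto the fixed-marginal constraints
        Ktu = (K.T @ u.T).T       # second marginals, shape (N, n)
        b = np.exp(weights @ np.log(Ktu))  # weighted geometric mean
        v = b / Ktu               # project onto the shared-barycenter constraint
    return b
```

As a sanity check, the barycenter of two point masses at the ends of a line segment, with squared-distance cost and equal weights, is centered at the midpoint.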
10.11.2020 Simon Breneis (WIAS Berlin)
Functions of bounded variation in one and multiple dimensions
We first recall univariate functions of bounded variation and treat a new result on the Hölder continuity of functions of bounded variation and their variation functions. Then, we discuss possible extensions of the concept of bounded variation to functions of several (though finitely many) variables, compare those generalizations and discuss their uses and shortcomings.
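In the univariate case, the variation over a partition is simply the sum of absolute increments, and the total variation is the supremum over all partitions. A small illustrative helper (name assumed) computes the variation of a function sampled on a given partition, which is a lower bound for the total variation:

```python
import numpy as np

def partition_variation(values):
    """Variation of a sampled function over the given partition:
    sum of |f(x_{i+1}) - f(x_i)|. This lower-bounds the total
    variation sup over all partitions; the two agree when the
    sample points include all local extrema."""
    values = np.asarray(values, dtype=float)
    return float(np.sum(np.abs(np.diff(values))))
```

For a monotone sample the variation equals the total increase; an oscillating sample accumulates every up- and down-swing.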
17.11.2020 n.n.

24.11.2020 Alexander Gasnikov (WIAS Berlin)
Parallel and distributed algorithms for ML problems
01.12.2020 Roman Kravchenko (HU Berlin)
Distributed optimization with quantization for computing Wasserstein barycenters
08.12.2020 n.n.

15.12.2020 n.n.

12.01.2021 Pavel Dvurechensky (WIAS Berlin)
Accelerated alternating minimization methods
We combine alternating minimization (AM) and Nesterov-type momentum acceleration and propose a generic accelerated alternating minimization method with a $1/k^2$ convergence rate in terms of the objective for convex problems and $1/k$ in terms of the squared gradient norm for non-convex problems, where $k$ is the iteration counter. Our method does not require knowledge of either the convexity of the problem or function parameters such as the smoothness constant, i.e. it is adaptive to convexity and smoothness. Further, we develop its primal-dual modification for convex problems with linear constraints. We consider two applications of our methods to highlight their properties. The first one is the non-convex collaborative filtering problem, and the second one is optimal transport, where our primal-dual method takes the form of an accelerated Sinkhorn's algorithm or an accelerated Iterative Bregman Projections algorithm. In both applications, we show numerically that our methods outperform the baselines. In the second application, we also obtain state-of-the-art complexity results for optimal transport problems.
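The accelerated primal-dual scheme itself is beyond a short sketch, but the plain Sinkhorn iteration it accelerates is a textbook instance of alternating minimization: each update exactly minimizes the regularized dual over one block of scaling variables with the other held fixed (names and parameters below are illustrative):

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=1000):
    """Sinkhorn iterations for entropic optimal transport between marginals
    a and b with cost matrix C: alternating exact minimization over the two
    dual scaling vectors u and v."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # exact block minimization over v, u fixed
        u = a / (K @ v)     # exact block minimization over u, v fixed
    P = u[:, None] * K * v[None, :]   # recovered transport plan
    return P
```

At convergence the plan's marginals match a and b; the row marginals are exact after every u-update, which is the alternating-projection structure the accelerated method exploits.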
19.01.2021 Dmitrii M. Ostrovskii (University of Southern California)
Please note: at 4 pm due to the time difference! Near-optimal model discrimination with non-disclosure (online talk)

last reviewed: January 11, 2021 by Christine Schneider