Leibniz MMS Days 2023 - Abstract
The principles of robust machine learning (ML) under distribution shift are far from clear. Such shifts can arise from causal confounding, unfairness due to data biases, or adversarial attacks. To address these settings, I will introduce a mathematically principled robustification framework: distributionally robust optimization (DRO). I will then present state-of-the-art DRO methods for ML problems under distribution shift, built on mathematical tools such as the Wasserstein metric and the kernel maximum mean discrepancy. I will demonstrate that these geometries provide principled theory, state-of-the-art extensions, and practical computational algorithms for robust machine learning.
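As a brief sketch of the framework named above (the notation here is standard DRO convention, not taken from the abstract), DRO replaces empirical risk minimization with a worst-case risk over an ambiguity set of distributions near the empirical distribution $\hat{P}$:

```latex
% Distributionally robust optimization (sketch):
% minimize the worst-case expected loss over all distributions Q
% within distance \epsilon of the empirical distribution \hat{P}.
\min_{\theta} \; \sup_{Q \,:\, D(Q, \hat{P}) \le \epsilon} \; \mathbb{E}_{\xi \sim Q}\!\left[ \ell(\theta, \xi) \right]
```

Here $D$ is a discrepancy between distributions, e.g. the Wasserstein metric or the kernel maximum mean discrepancy mentioned in the abstract, and $\epsilon$ controls the size of the ambiguity set.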