WIAS Preprint No. 3032 (2023)

Kernel mirror prox and RKHS gradient flow for mixed functional Nash equilibrium



Authors

  • Dvurechensky, Pavel
    ORCID: 0000-0003-1201-2343
  • Zhu, Jia-Jie

2020 Mathematics Subject Classification

  • 90C90, 68Q25, 65K15

Keywords

  • Gradient flow, distributionally robust optimization, Nash equilibrium, optimization, kernel methods, generative models

DOI

10.20347/WIAS.PREPRINT.3032

Abstract

The theoretical analysis of machine learning algorithms, such as deep generative modeling, has motivated multiple recent works on the Mixed Nash Equilibrium (MNE) problem. Departing from MNE, this paper formulates the Mixed Functional Nash Equilibrium (MFNE), which replaces one of the measure-optimization problems with optimization over a class of dual functions, e.g., a reproducing kernel Hilbert space (RKHS) in the case of the Mixed Kernel Nash Equilibrium (MKNE). We show that our MFNE and MKNE frameworks form the backbone of several existing machine learning algorithms, such as implicit generative models, distributionally robust optimization (DRO), and Wasserstein barycenters. To model the infinite-dimensional continuous-limit optimization dynamics, we propose the Interacting Wasserstein-Kernel Gradient Flow, which includes the RKHS flow; this flow is much less common than the Wasserstein gradient flow but enjoys a much simpler convexity structure. Time-discretizing this gradient flow, we propose a primal-dual kernel mirror prox algorithm that alternates between a dual step in the RKHS and a primal step in the space of probability measures. We then provide the first unified convergence analysis of our algorithm for this class of MKNE problems, establishing a convergence rate of O(1/N) in the deterministic case and O(1/√N) in the stochastic case. As a case study, we apply our analysis to DRO, providing the first primal-dual convergence analysis for DRO with probability-metric constraints.
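
To make the primal-dual structure described above concrete, the following is a minimal Python/NumPy sketch, not the paper's implementation: it discretizes the measure player on a finite 1D grid, parameterizes the dual RKHS function by kernel expansion coefficients, and runs a mirror-prox (extragradient) iteration with an entropic mirror step for the measure and a Euclidean ascent step for the RKHS coefficients. The grid, Gaussian kernel bandwidth, target measure q, regularization strength, and step size are all illustrative assumptions.

    # Minimal illustrative sketch (NOT the paper's implementation): mirror prox
    # for the discretized kernel saddle-point problem
    #   min_{p in simplex}  max_{beta}  (p - q)^T K beta - (lam/2) beta^T K beta,
    # whose inner maximum is a squared-MMD-type discrepancy between p and a
    # fixed target q.
    import numpy as np

    # Discretize the measure player on a 1D grid (illustrative choice).
    n = 50
    grid = np.linspace(-3.0, 3.0, n)

    # Gaussian-kernel Gram matrix; the bandwidth is an assumed hyperparameter.
    bw = 0.5
    K = np.exp(-(grid[:, None] - grid[None, :]) ** 2 / (2 * bw ** 2))

    # Fixed target measure q (discretized Gaussian), purely for illustration.
    q = np.exp(-grid ** 2 / 2)
    q /= q.sum()

    lam = 1.0                  # RKHS regularization strength (assumed)
    eta = 0.05                 # step size (assumed)
    p = np.full(n, 1.0 / n)    # primal: uniform initial measure on the grid
    beta = np.zeros(n)         # dual: RKHS expansion coefficients of f = K @ beta

    def grads(p, beta):
        """Gradients of L(p, beta) = (p - q)^T K beta - (lam/2) beta^T K beta."""
        g_p = K @ beta                            # gradient w.r.t. the measure p
        g_beta = K @ (p - q) - lam * (K @ beta)   # gradient w.r.t. dual coefficients
        return g_p, g_beta

    def entropic_step(p, g, eta):
        """Mirror (multiplicative) step on the simplex with the entropy mirror map."""
        w = p * np.exp(-eta * g)
        return w / w.sum()

    for it in range(500):
        # Extrapolation step using gradients at the current point.
        g_p, g_b = grads(p, beta)
        p_half = entropic_step(p, g_p, eta)
        beta_half = beta + eta * g_b              # ascent for the dual player
        # Update step using gradients at the extrapolated point.
        g_p, g_b = grads(p_half, beta_half)
        p = entropic_step(p, g_p, eta)
        beta = beta + eta * g_b

    # At equilibrium the primal measure p should approximate the target q.
    print("total variation gap:", 0.5 * np.abs(p - q).sum())

In this toy instance the dual player's best response is the RKHS witness function of the discrepancy between p and q, so the primal iterates are driven toward q; this only illustrates the alternating primal-dual mechanics in spirit and is not a verified instance of the paper's O(1/N) rate.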

Appeared in

  • Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, S. Dasgupta, S. Mandt, Y. Li, eds., vol. 238 of Proceedings of Machine Learning Research, 2024, pp. 2350–2358.
