Zeroth-order algorithms for smooth saddle-point problems
Authors
- Sadiev, Abdurakhmon
- Beznosikov, Aleksandr
- Dvurechensky, Pavel (ORCID: 0000-0003-1201-2343)
- Gasnikov, Alexander
2020 Mathematics Subject Classification
- 90C30 90C25 68Q25
Keywords
- Zeroth-order optimization, saddle-point problems, stochastic optimization
Abstract
Saddle-point problems have recently gained increased attention from the machine learning community, mainly due to applications in training Generative Adversarial Networks using stochastic gradients. At the same time, in some applications only a zeroth-order oracle is available. In this paper, we propose several algorithms to solve stochastic smooth (strongly) convex-concave saddle-point problems using zeroth-order oracles, and estimate their convergence rate and its dependence on the dimension n of the variable. In particular, our analysis shows that in the case when the feasible set is a direct product of two simplices, our convergence rate for the stochastic term is only by a log n factor worse than for the first-order methods. We also consider a mixed setup and develop 1/2th-order methods which use a zeroth-order oracle for the minimization part and a first-order oracle for the maximization part. Finally, we demonstrate the performance of our zeroth-order and 1/2th-order methods on practical problems.
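To illustrate the zeroth-order setting described above, the sketch below runs plain gradient descent-ascent on a toy strongly convex-concave objective, replacing true gradients with a two-point sphere-sampling estimator that queries only function values. This is a minimal illustration under assumed parameters (toy objective, step size, smoothing radius), not the algorithms or convergence analysis proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
A = 0.3 * rng.standard_normal((n, n))  # assumed coupling matrix for the toy problem

def f(x, y):
    # Toy strongly convex-concave objective with saddle point at (0, 0).
    return 0.5 * x @ x + x @ A @ y - 0.5 * y @ y

def zo_grad(g, z, tau=1e-5):
    # Two-point zeroth-order gradient estimator: a finite difference of g
    # along a random unit direction e, rescaled by the dimension. Only
    # function values of g are queried, never its gradient.
    e = rng.standard_normal(z.shape)
    e /= np.linalg.norm(e)
    return (g(z + tau * e) - g(z - tau * e)) / (2 * tau) * len(z) * e

x = rng.standard_normal(n)
y = rng.standard_normal(n)
lr = 0.02  # assumed step size, small enough for the estimator's variance
for _ in range(3000):
    gx = zo_grad(lambda u: f(u, y), x)  # estimated gradient in the min variable
    gy = zo_grad(lambda v: f(x, v), y)  # estimated gradient in the max variable
    x -= lr * gx  # descend in x
    y += lr * gy  # ascend in y

# Both iterates should approach the saddle point at the origin.
print(np.linalg.norm(x), np.linalg.norm(y))
```

In the mixed "1/2th-order" setup from the abstract, one would instead use the exact gradient for the maximization variable y and keep the zeroth-order estimator only for x.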
Appeared in
- Mathematical Optimization Theory and Operations Research: Recent Trends, A. Strekalovsky, Y. Kochetov, T. Gruzdeva, A. Orlov, eds., vol. 1476 of Communications in Computer and Information Science book series (CCIS), Springer International Publishing, Basel, 2021, pp. 71--85, DOI 10.1007/978-3-030-86433-0_5.