WIAS Preprint No. 2711, (2020)

Oracle complexity separation in convex optimization



Authors

  • Ivanova, Anastasiya
  • Gasnikov, Alexander
  • Dvurechensky, Pavel
    ORCID: 0000-0003-1201-2343
  • Dvinskikh, Darina
  • Tyurin, Alexander
  • Vorontsova, Evgeniya
  • Pasechnyuk, Dmitry

2010 Mathematics Subject Classification

  • 90C30
  • 90C25
  • 68Q25
  • 65K15

Keywords

  • Convex optimization, composite optimization, proximal method, acceleration, random coordinate descent, variance reduction

DOI

10.20347/WIAS.PREPRINT.2711

Abstract

Regularized empirical risk minimization problems, which are ubiquitous in machine learning, are often composed of several blocks that can be treated with different types of oracles, e.g., full gradient, stochastic gradient, or coordinate derivative. The optimal oracle complexity is known and achievable separately for the full gradient case, the stochastic gradient case, etc. We propose a generic framework that combines optimal algorithms for different types of oracles so as to achieve the separate optimal oracle complexity for each block, i.e., for each block the corresponding oracle is called the optimal number of times for a given accuracy. As a particular example, we demonstrate that for a combination of a full gradient oracle with either a stochastic gradient oracle or a coordinate descent oracle, our approach yields the optimal number of oracle calls separately for the full gradient part and for the stochastic/coordinate descent part.
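To make the two-oracle setting concrete, here is a minimal toy sketch (not the authors' framework or its optimal oracle balancing): a composite quadratic objective F(x) = f(x) + g(x), where the block f is queried through an exact full-gradient oracle and the finite-sum block g is queried only through a minibatch stochastic-gradient oracle. All problem data, dimensions, and step sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy composite problem: F(x) = f(x) + g(x) with
#   f(x) = (lam/2) * ||x||^2            (cheap exact-gradient block)
#   g(x) = (1/2n) * ||A x - b||^2       (finite-sum block, accessed stochastically)
# Data are synthetic and purely illustrative, not from the paper.
d, n = 5, 200
A = rng.standard_normal((n, d))
x_true = np.ones(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)
lam = 0.1

def full_grad_f(x):
    # One call to the full-gradient oracle for the block f.
    return lam * x

def stoch_grad_g(x, batch=10):
    # One call to the stochastic-gradient oracle for the block g:
    # an unbiased minibatch estimate of grad g(x).
    idx = rng.integers(0, n, size=batch)
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / batch

# Plain SGD on the composite: each step makes one call to each oracle.
# (The paper's framework instead balances the two oracle budgets so that
# each oracle is called its own optimal number of times; this loop only
# illustrates the problem setting.)
x = np.zeros(d)
step = 0.02
for _ in range(3000):
    x -= step * (full_grad_f(x) + stoch_grad_g(x))

# Closed-form minimizer of F, for comparison.
x_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
print(np.linalg.norm(x - x_star))
```

Counting oracle calls per block separately, as in this loop, is exactly the bookkeeping the abstract refers to: the point of the framework is that neither count needs to be inflated to match the other.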
