License: Creative Commons Attribution 3.0 Unported license (CC BY 3.0)
When quoting this document, please refer to the following
DOI: 10.4230/LIPIcs.ITCS.2021.62
URN: urn:nbn:de:0030-drops-136011
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2021/13601/


Cohen, Michael B.; Sidford, Aaron; Tian, Kevin

Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration

pdf-format:
LIPIcs-ITCS-2021-62.pdf (0.6 MB)


Abstract

We show that standard extragradient methods (i.e. mirror prox [Arkadi Nemirovski, 2004] and dual extrapolation [Yurii Nesterov, 2007]) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result we provide a fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness, and thereby recover rates for box-constrained ℓ_∞ regression based on area convexity [Jonah Sherman, 2017] and complexity bounds achieved by accelerated (randomized) coordinate descent [Zeyuan Allen-Zhu et al., 2016; Yurii Nesterov and Sebastian U. Stich, 2017] for smooth convex function minimization.
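
To make the extragradient step concrete, the following is a minimal sketch of its Euclidean instantiation (mirror prox with the squared-Euclidean Bregman divergence), applied to the bilinear saddle-point problem min_x max_y x^T A y, whose associated operator g(x, y) = (Ay, -A^T x) is monotone. The matrix, step size, and iteration count are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

def g(x, y):
    # Monotone operator of the bilinear game min_x max_y x^T A y;
    # for generic A its unique saddle point is x = y = 0.
    return A @ y, -A.T @ x

x, y = rng.standard_normal(n), rng.standard_normal(n)
eta = 0.5 / np.linalg.norm(A, 2)  # step size below 1 / (Lipschitz constant of g)

for _ in range(5000):
    # Half step: evaluate the operator at the current iterate.
    gx, gy = g(x, y)
    x_half, y_half = x - eta * gx, y - eta * gy
    # Full step: re-evaluate the operator at the half-step point,
    # but apply it from the original iterate (the extragradient trick).
    gx, gy = g(x_half, y_half)
    x, y = x - eta * gx, y - eta * gy

print(np.linalg.norm(x), np.linalg.norm(y))  # both decay toward 0

Note that plain simultaneous gradient descent-ascent diverges on this problem, while the extra operator evaluation at the half step yields convergence; the relative Lipschitzness condition described in the abstract is what the paper's analysis uses to control this second evaluation.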

BibTeX - Entry

@InProceedings{cohen_et_al:LIPIcs.ITCS.2021.62,
  author =	{Michael B. Cohen and Aaron Sidford and Kevin Tian},
  title =	{{Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration}},
  booktitle =	{12th Innovations in Theoretical Computer Science Conference (ITCS 2021)},
  pages =	{62:1--62:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-177-1},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{185},
  editor =	{James R. Lee},
  publisher =	{Schloss Dagstuhl--Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/opus/volltexte/2021/13601},
  URN =		{urn:nbn:de:0030-drops-136011},
  doi =		{10.4230/LIPIcs.ITCS.2021.62},
  annote =	{Keywords: Variational inequalities, minimax optimization, acceleration, \ell_\infty regression}
}

Keywords: Variational inequalities, minimax optimization, acceleration, ℓ_∞ regression
Collection: 12th Innovations in Theoretical Computer Science Conference (ITCS 2021)
Issue Date: 2021
Date of publication: 04.02.2021

