A Universally Optimal Multistage Accelerated Stochastic Gradient Method

Necdet Serhat Aybat, Alireza Fallah, Mert Gürbüzbalaban, Asuman E. Ozdaglar. A Universally Optimal Multistage Accelerated Stochastic Gradient Method. In Hanna M. Wallach, Hugo Larochelle, Alina Beygelzimer, Florence d'Alché-Buc, Emily B. Fox, Roman Garnett, editors, Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada, pages 8523-8534, 2019.

@inproceedings{AybatFGO19,
  title = {A Universally Optimal Multistage Accelerated Stochastic Gradient Method},
  author = {Necdet Serhat Aybat and Alireza Fallah and Mert Gürbüzbalaban and Asuman E. Ozdaglar},
  year = {2019},
  url = {http://papers.nips.cc/paper/9059-a-universally-optimal-multistage-accelerated-stochastic-gradient-method},
  researchr = {https://researchr.org/publication/AybatFGO19},
  pages = {8523--8534},
  booktitle = {Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada},
  editor = {Hanna M. Wallach and Hugo Larochelle and Alina Beygelzimer and Florence d'Alché-Buc and Emily B. Fox and Roman Garnett},
}