Stochastic Second-Order Methods Improve Best-Known Sample Complexity of SGD for Gradient-Dominated Functions

Saeed Masiha, Saber Salehkaleybar, Niao He, Negar Kiyavash, Patrick Thiran. Stochastic Second-Order Methods Improve Best-Known Sample Complexity of SGD for Gradient-Dominated Functions. In Sanmi Koyejo, S. Mohamed, A. Agarwal, Danielle Belgrave, K. Cho, A. Oh, editors, Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, New Orleans, LA, USA, November 28 - December 9, 2022.

@inproceedings{MasihaSHKT22,
  title = {Stochastic Second-Order Methods Improve Best-Known Sample Complexity of SGD for Gradient-Dominated Functions},
  author = {Saeed Masiha and Saber Salehkaleybar and Niao He and Negar Kiyavash and Patrick Thiran},
  year = {2022},
  url = {http://papers.nips.cc/paper_files/paper/2022/hash/46323351ebc2afa42b30a6122815cb95-Abstract-Conference.html},
  researchr = {https://researchr.org/publication/MasihaSHKT22},
  booktitle = {Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, New Orleans, LA, USA, November 28 - December 9, 2022},
  editor = {Sanmi Koyejo and S. Mohamed and A. Agarwal and Danielle Belgrave and K. Cho and A. Oh},
  isbn = {9781713871088},
}