Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance

Hongjian Wang, Mert Gürbüzbalaban, Lingjiong Zhu, Umut Simsekli, Murat A. Erdogdu. Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance. In Marc'Aurelio Ranzato, Alina Beygelzimer, Yann N. Dauphin, Percy Liang, Jennifer Wortman Vaughan, editors, Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, NeurIPS 2021, December 6-14, 2021, virtual. pages 18866-18877, 2021.

@inproceedings{WangGZSE21,
  title = {Convergence Rates of Stochastic Gradient Descent under Infinite Noise Variance},
  author = {Hongjian Wang and Mert Gürbüzbalaban and Lingjiong Zhu and Umut Simsekli and Murat A. Erdogdu},
  year = {2021},
  url = {https://proceedings.neurips.cc/paper/2021/hash/9cdf26568d166bc6793ef8da5afa0846-Abstract.html},
  researchr = {https://researchr.org/publication/WangGZSE21},
  cites = {0},
  citedby = {0},
  pages = {18866--18877},
  booktitle = {Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, NeurIPS 2021, December 6-14, 2021, virtual},
  editor = {Marc'Aurelio Ranzato and Alina Beygelzimer and Yann N. Dauphin and Percy Liang and Jennifer Wortman Vaughan},
}