Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent

Jaehoon Lee, Lechao Xiao, Samuel S. Schoenholz, Yasaman Bahri, Roman Novak, Jascha Sohl-Dickstein, Jeffrey Pennington. Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent. In Hanna M. Wallach, Hugo Larochelle, Alina Beygelzimer, Florence d'Alché-Buc, Edward A. Fox, Roman Garnett, editors, Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada. pages 8570-8581, 2019.

@inproceedings{LeeXSBNSP19,
  title = {Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent},
  author = {Jaehoon Lee and Lechao Xiao and Samuel S. Schoenholz and Yasaman Bahri and Roman Novak and Jascha Sohl-Dickstein and Jeffrey Pennington},
  year = {2019},
  url = {http://papers.nips.cc/paper/9063-wide-neural-networks-of-any-depth-evolve-as-linear-models-under-gradient-descent},
  researchr = {https://researchr.org/publication/LeeXSBNSP19},
  pages = {8570--8581},
  booktitle = {Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada},
  editor = {Hanna M. Wallach and Hugo Larochelle and Alina Beygelzimer and Florence d'Alché-Buc and Edward A. Fox and Roman Garnett},
}
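
To cite this entry from a LaTeX document, reference the BibTeX key LeeXSBNSP19 with \cite. A minimal sketch, assuming the entry above is saved in a file named refs.bib (the filename is illustrative, not part of the original record); compile with latex, then bibtex, then latex twice:

\documentclass{article}
\begin{document}
Wide neural networks of any depth evolve as linear models under
gradient descent~\cite{LeeXSBNSP19}.
% plain is one standard style; swap in the style your venue requires
\bibliographystyle{plain}
\bibliography{refs}
\end{document}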