If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks

Arnu Pretorius, Elan van Biljon, Benjamin van Niekerk, Ryan Eloff, Matthew Reynard, Steven James, Benjamin Rosman, Herman Kamper, Steve Kroon. If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks. Pattern Recognition Letters, 138:95-105, 2020.
