Xavier Glorot
(3 articles)
1986
Weight Initialization
An essential step in neural network training: setting the initial values of the model's weights, a choice that strongly influences learning effectiveness and convergence (see the sketch after this entry).
Generality: 675
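A minimal NumPy sketch of the basic idea, assuming a fully connected network. The function name, layer sizes, and the 0.01 standard deviation are illustrative choices, not values prescribed by the entry.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(fan_in, fan_out, std=0.01):
    """Naive initialization: small zero-mean Gaussian weights and zero biases."""
    W = rng.normal(loc=0.0, scale=std, size=(fan_in, fan_out))
    b = np.zeros(fan_out)
    return W, b

# Layer sizes are illustrative only (a small MLP for MNIST-sized inputs).
sizes = [784, 256, 128, 10]
params = [init_layer(n_in, n_out) for n_in, n_out in zip(sizes[:-1], sizes[1:])]
for i, (W, b) in enumerate(params):
    print(f"layer {i}: W {W.shape}, empirical std {W.std():.4f}")
```

Random (rather than constant) values break the symmetry between units, while a small scale keeps early activations and gradients in a reasonable range; the later entries refine how that scale should be chosen.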
2010
Xavier's Initialization
A weight initialization technique designed to keep the variance of a neuron's outputs approximately equal to the variance of its inputs across the layers of a deep neural network (see the sketch after this entry).
Generality: 669
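A short NumPy sketch of the Glorot/Xavier rule, which targets Var(W) = 2 / (fan_in + fan_out); the function names and layer sizes are mine for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_uniform(fan_in, fan_out):
    """Xavier/Glorot uniform init: U(-limit, limit) gives Var(W) = 2 / (fan_in + fan_out)."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def xavier_normal(fan_in, fan_out):
    """Gaussian variant with the same target variance 2 / (fan_in + fan_out)."""
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = xavier_uniform(512, 256)
# Empirical variance should be close to the 2 / (fan_in + fan_out) target.
print(W.var(), 2.0 / (512 + 256))
```

Averaging fan-in and fan-out in the denominator balances the forward pass (activation variance) against the backward pass (gradient variance), which is why the rule uses both.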
2015
Variance Scaling
A family of weight initialization techniques that scale the initial weight variance by a layer's fan-in and/or fan-out so that the variance of activations stays consistent throughout the network (see the sketch after this entry).
Generality: 525
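A minimal sketch of the general recipe, assuming a dense layer and Gaussian draws with Var(W) = scale / fan; the function name, its parameters, and the example sizes are illustrative assumptions rather than a specific library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_scaling(fan_in, fan_out, scale=2.0, mode="fan_in"):
    """Draw weights with Var(W) = scale / fan, where `fan` is chosen by `mode`."""
    fan = {"fan_in": fan_in,
           "fan_out": fan_out,
           "fan_avg": (fan_in + fan_out) / 2.0}[mode]
    std = np.sqrt(scale / fan)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# scale=2.0 with mode="fan_in" matches He initialization (common with ReLU);
# scale=1.0 with mode="fan_avg" matches Xavier/Glorot initialization.
W = variance_scaling(1024, 512)
print(W.var(), 2.0 / 1024)  # empirical variance vs. the scale/fan target
```

Framed this way, Xavier and He initialization are just particular settings of the scale and fan mode, which is how variance scaling generalizes the 2010 rule to other activations.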