Log-normal Parameters

Has anybody given a serious go at using a log-normal initializer for deep neural network parameters? It seems likely to make sense. Additionally, one could also apply batch- and layer-style log-normalization to layer activations.
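As a rough sketch of what such an initializer might look like: log-normal samples are strictly positive, so one plausible variant flips signs at random to keep the weights zero-centered, then rescales to a He-style `1/sqrt(fan_in)` standard deviation. The function name, the `sigma` choice, and the rescaling convention here are all illustrative assumptions, not an established recipe.

```python
import numpy as np

def lognormal_init(shape, sigma=0.5, rng=None):
    """Hypothetical log-normal weight initializer.

    Log-normal draws are strictly positive, so a random sign flip
    is applied to make the weights zero-centered on average; the
    result is then rescaled to a He-style 1/sqrt(fan_in) std.
    """
    rng = np.random.default_rng() if rng is None else rng
    fan_in = shape[0]
    w = rng.lognormal(mean=0.0, sigma=sigma, size=shape)
    w *= rng.choice([-1.0, 1.0], size=shape)   # symmetrize the sign
    w *= 1.0 / (w.std() * np.sqrt(fan_in))     # match 1/sqrt(fan_in) std
    return w

W = lognormal_init((256, 128))
print(W.shape, float(W.std()))  # std is exactly 1/sqrt(256) = 0.0625 by construction
```

Compared with a Gaussian of the same variance, the sign-symmetrized log-normal has heavier tails, so a few weights start out unusually large; whether that helps or hurts training is exactly the empirical question the post is asking.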
