Computational Analysis and Understanding of Natural Languages: Principles, Methods and Applications by Venkat N. Gudivada & C.R. Rao

Author: Venkat N. Gudivada & C.R. Rao
Language: eng
Format: epub
ISBN: 9780444640437
Publisher: Elsevier Ltd.
Published: 2018-08-27T16:00:00+00:00


5.1 Parameter Norm Penalties

Parameter norm penalties are straightforward and effective regularization strategies that were applied to linear models such as linear regression and logistic regression long before the advent of DL.

Many regularization approaches limit the capacity of the model by adding a parameter norm penalty Ω(θ) to the objective function, so that the training algorithm minimizes the sum of the original objective J(θ) on the training data and a weighted penalty term α Ω(θ), where the coefficient α ≥ 0 controls how strongly the penalty contributes relative to J. The penalty Ω measures the size of the parameters, or of a subset of them, and here we briefly discuss the effect of using different norms as penalties on the model parameters. For neural networks, we typically use a parameter norm penalty that penalizes only the weights of the affine transformation at each layer and leaves the biases unregularized. Each weight specifies the interaction between two variables, so fitting it accurately requires observing both variables together, whereas each bias controls only a single variable and therefore requires less data to fit accurately. Moreover, regularizing the biases can result in a significant amount of underfitting.
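As a minimal sketch of this idea (not code from the book), the following PyTorch snippet adds a squared L2 weight penalty α Ω(θ) to the training loss while deliberately excluding the bias parameters; the helper name weight_l2_penalty, the network shape, and the value of alpha are illustrative assumptions.

```python
import torch
import torch.nn as nn

# A small network: two affine layers, each with a weight matrix and a bias vector.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

alpha = 1e-4  # penalty coefficient (hyperparameter), assumed value

def weight_l2_penalty(model):
    """Sum of squared weights; bias parameters are deliberately left unregularized."""
    penalty = 0.0
    for name, param in model.named_parameters():
        if name.endswith("weight"):          # skips "*.bias" parameters
            penalty = penalty + param.pow(2).sum()
    return penalty

# One training step on a dummy batch: minimize J(theta) + alpha * Omega(theta).
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = criterion(model(x), y) + alpha * weight_l2_penalty(model)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Filtering on parameter names is one simple way to restrict the penalty to the weights; the rest of the training loop is unchanged.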

Using a separate penalty coefficient α for each layer of a neural network can be useful; however, searching for the correct value of several hyperparameters makes the computation more expensive, so using the same α coefficient for all layers remains a reasonable choice. A per-layer variant is sketched below.
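If per-layer coefficients are desired despite the extra tuning cost, one way to express them in PyTorch (again an illustrative sketch, not the book's code) is through optimizer parameter groups, giving each layer's weights its own weight_decay and giving the biases none; the specific decay values here are arbitrary placeholders.

```python
import torch
import torch.nn as nn

# Same two-layer network as before.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

# Parameter groups: each affine layer's weights get their own decay coefficient
# (playing the role of alpha), while the biases receive no decay at all.
param_groups = [
    {"params": model[0].weight, "weight_decay": 1e-4},              # first layer
    {"params": model[2].weight, "weight_decay": 1e-3},              # second layer
    {"params": [model[0].bias, model[2].bias], "weight_decay": 0.0},  # biases
]
optimizer = torch.optim.SGD(param_groups, lr=0.01)
```

With a single shared α, the list above collapses to one group for all weights, which is the cheaper default discussed in the text.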


