Learning Rate

The learning rate is a hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated.
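The role of the learning rate can be sketched in a couple of lines. This is a minimal illustration (names like `step` and the toy loss `f(w) = w**2` are assumptions for the example, not part of any specific framework):

```python
# Minimal sketch: one gradient-descent update on the toy loss f(w) = w**2.
# The learning rate 'lr' scales the gradient before it is subtracted
# from the weight, so it controls how big each update is.
def step(w, lr):
    grad = 2 * w          # derivative of w**2 with respect to w
    return w - lr * grad  # weight update

w = 5.0
w = step(w, lr=0.1)       # moves w toward the minimum at 0
```

A larger `lr` moves the weight further per update; the two failure modes below follow directly from making it too large or too small.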

Too large

When the learning rate is too large, each update overshoots the minimum, so the model keeps jumping from one side of the loss valley to the other. Training becomes unstable: the loss oscillates or diverges, and the weights never settle on good values.

Too low

A learning rate that is too low makes training very slow, since each update barely changes the weights, and the model can get stuck in a flat region or poor local minimum before reaching a good solution.
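Both failure modes can be seen on the same toy loss `f(w) = w**2` (the function `descend` and the specific rates are assumptions chosen to make the behavior visible, not recommended values):

```python
# Run plain gradient descent on f(w) = w**2 for a fixed number of steps.
def descend(lr, steps=50, w=5.0):
    for _ in range(steps):
        w -= lr * 2 * w   # gradient of w**2 is 2w
    return w

# lr = 1.1: each step multiplies w by (1 - 2*1.1) = -1.2, so |w| grows
#           and the sign flips -- jumping across the valley, diverging.
# lr = 0.001: each step multiplies w by 0.998 -- after 50 steps w has
#             barely moved from its starting point of 5.0.
# lr = 0.1: each step multiplies w by 0.8 -- w shrinks quickly toward
#           the minimum at 0.
```

Calling `descend(1.1)` blows up, `descend(0.001)` stays near 5.0, and `descend(0.1)` ends up very close to 0, matching the two failure modes described above.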
