If a learning rate is set too high, what could potentially happen during training?


Setting the learning rate too high can cause the model to settle too quickly on a sub-optimal set of weights, or to become unstable altogether. Because each update step is too large, the model overshoots the optimal weight values, producing large fluctuations in the loss function. Instead of gradually converging toward the minimum, the model bounces back and forth across it and never settles into a region of good weights.

This instability can show up as a loss that increases instead of decreasing, or as outright divergence, leaving the model unable to find a good solution. An appropriately sized learning rate, by contrast, lets the model update its weights gradually, steadily improving performance and converging without erratically overshooting the optimum.
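The overshooting behavior is easy to see on a toy one-dimensional problem. The sketch below is a minimal illustration in plain Python, using a hypothetical quadratic loss f(w) = w² (not anything from the exam material): with a small learning rate the loss trajectory shrinks toward zero, while with a learning rate that is too large each step lands farther from the minimum than the last and the loss grows without bound.

```python
# Illustrative sketch: gradient descent on a toy quadratic loss f(w) = w**2,
# whose gradient is 2*w. The loss and learning-rate values are hypothetical,
# chosen only to contrast convergent vs. divergent behavior.

def gradient_descent(learning_rate, steps=10, w=5.0):
    """Run plain gradient descent on f(w) = w**2 and return the loss trajectory."""
    losses = []
    for _ in range(steps):
        grad = 2 * w           # derivative of w**2
        w = w - learning_rate * grad
        losses.append(w ** 2)  # loss after the update
    return losses

# A modest learning rate: the loss shrinks steadily toward the minimum at w = 0.
print(gradient_descent(learning_rate=0.1))

# A learning rate this large overshoots the minimum on every step, so the
# weight oscillates with growing magnitude and the loss increases each step.
print(gradient_descent(learning_rate=1.1))
```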
