AI Dynamics

Global AI News Aggregator

Learning Rate Impact on Weight Updates in Neural Networks

Now update the weights. The learning rate is the hyperparameter here: a low learning rate can leave the model stuck in a local optimum (or converging very slowly), while a high learning rate can cause the updates to overshoot the optimal solution. Note the minus sign: assuming d_W1 and d_b1 are the gradients of the loss, we step against them.

W1 -= learning_rate * d_W1
b1 -= learning_rate * d_b1
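A minimal runnable sketch of the update rule above. The names (W1, b1, d_W1, d_b1, learning_rate) follow the post; the data, the one-layer linear model, and the mean-squared-error loss are assumptions added for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))      # 8 samples, 3 features (assumed data)
y = rng.normal(size=(8, 1))      # targets (assumed)
W1 = rng.normal(size=(3, 1))     # weights of a single linear layer
b1 = np.zeros(1)                 # bias

learning_rate = 0.1              # the hyperparameter under discussion
losses = []

for _ in range(100):
    pred = X @ W1 + b1                       # forward pass
    err = pred - y
    losses.append(float((err ** 2).mean()))  # mean-squared-error loss
    d_W1 = 2 * X.T @ err / len(X)            # gradient of loss w.r.t. W1
    d_b1 = 2 * err.mean(axis=0)              # gradient of loss w.r.t. b1
    W1 -= learning_rate * d_W1               # step *against* the gradient
    b1 -= learning_rate * d_b1
```

With a learning rate this small the loss shrinks steadily; raising it well past the stable range for this problem makes the same loop diverge, which is the overshoot behavior the post describes.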

→ View original post on X — @sumanth_077
