AI Dynamics

Global AI News Aggregator

Gradient Descent: Adjusting Weights and Biases to Reduce Loss

5: Gradient Descent: We now adjust the weights and biases using the gradients calculated in the previous step. Typically, each parameter is updated by subtracting its gradient multiplied by a small factor called the learning rate. The basic idea is to reduce the error, or loss. 6/10
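The update rule described above can be sketched as follows. This is a minimal illustration on a single-sample squared loss; the variable names (`w`, `b`, `lr`) and the toy data are illustrative, not from the original post.

```python
# One gradient-descent step for a linear model pred = w*x + b
# with squared loss L = (pred - y)**2.

def step(w, b, x, y, lr=0.1):
    """Apply one gradient-descent update to weight w and bias b."""
    pred = w * x + b
    error = pred - y
    # Gradients of the squared loss with respect to w and b.
    grad_w = 2 * error * x
    grad_b = 2 * error
    # Core update rule: parameter -= learning_rate * gradient.
    return w - lr * grad_w, b - lr * grad_b

# Repeating the step drives the loss down: here pred approaches y = 2.0.
w, b = 0.0, 0.0
for _ in range(100):
    w, b = step(w, b, x=1.0, y=2.0, lr=0.1)
```

The learning rate controls the step size: too large and the updates overshoot and diverge, too small and convergence is slow.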

→ View original post on X — @abacusai
