Backpropagation: Essential for Training Deep Neural Networks

Backpropagation is essential in deep learning: by applying the chain rule layer by layer, it computes the derivatives of highly complex neural networks composed of stacked layers of simple functions. Gradient descent then uses these derivatives to update the network's weights, allowing it to learn from data and perform complex tasks.
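To make the idea concrete, here is a minimal sketch of backpropagation and gradient descent on a tiny two-layer network, y = w2 * relu(w1 * x). The network shape, weight names, and numbers are illustrative assumptions, not from the article; real frameworks automate exactly this chain-rule bookkeeping.

```python
def relu(z):
    # Simple nonlinearity used by the hidden layer.
    return max(0.0, z)

def forward(x, w1, w2):
    h = relu(w1 * x)   # first layer: a simple function of the input
    y = w2 * h         # second layer stacked on top
    return h, y

def backward(x, h, y, target, w1, w2):
    """Chain rule applied layer by layer, from the loss back to each weight."""
    dy = 2.0 * (y - target)                       # dL/dy for L = (y - target)**2
    dw2 = dy * h                                  # gradient for the output weight
    dh = dy * w2                                  # propagate back through layer 2
    dw1 = dh * (1.0 if w1 * x > 0 else 0.0) * x   # ReLU gate, then dL/dw1
    return dw1, dw2

# Gradient descent: repeatedly nudge the weights against the gradient.
x, target, lr = 1.5, 2.0, 0.1
w1, w2 = 0.5, 1.0
for _ in range(100):
    h, y = forward(x, w1, w2)
    dw1, dw2 = backward(x, h, y, target, w1, w2)
    w1 -= lr * dw1
    w2 -= lr * dw2

_, y = forward(x, w1, w2)
print(f"output after training: {y:.3f}")  # approaches the target of 2.0
```

The same two-pass pattern (forward to compute activations, backward to compute gradients) scales to networks with millions of layers of such simple functions, which is what makes training deep models tractable.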