DR.GEEK

Backpropagation

(10th-June-2020)


• Backpropagation is a method to calculate the gradient of the loss function with respect to the weights in an artificial neural network. It is commonly used as a part of algorithms that optimize the performance of the network by adjusting the weights, for example in the gradient descent algorithm. It is also called backward propagation of errors.

• Often the term "backpropagation" is used to refer to the entire procedure of gradient calculation and its use in optimization. Strictly speaking, however, backpropagation is only the calculation of the gradient via the chain rule of elementary calculus, and it is independent of whichever optimization algorithm then uses that gradient.

Backpropagation learning is a gradient descent search through the parameter space that minimizes the sum-of-squares error.
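The idea above can be sketched in a few lines of NumPy. The following is a minimal, illustrative example (all names and network sizes are assumptions, not from the original post): a single-hidden-layer network with sigmoid units, where backpropagation computes the gradient of the sum-of-squares error and one gradient descent step uses it to reduce the loss.

```python
import numpy as np

# Illustrative sketch: backpropagation for a one-hidden-layer sigmoid
# network with sum-of-squares error E = 0.5 * sum((y - t)^2).
# Layer sizes and variable names are chosen for the example only.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    h = sigmoid(W1 @ x)   # hidden-layer activations
    y = sigmoid(W2 @ h)   # output-layer activations
    return h, y

def backprop(x, t, W1, W2):
    """Gradients of E with respect to W1 and W2 via the chain rule."""
    h, y = forward(x, W1, W2)
    delta_out = (y - t) * y * (1 - y)             # error signal at the output
    delta_hid = (W2.T @ delta_out) * h * (1 - h)  # error propagated backward
    gW2 = np.outer(delta_out, h)
    gW1 = np.outer(delta_hid, x)
    return gW1, gW2

def loss(x, t, W1, W2):
    _, y = forward(x, W1, W2)
    return 0.5 * np.sum((y - t) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                 # one input example
t = np.array([0.0, 1.0])               # its target output
W1 = rng.normal(scale=0.5, size=(4, 3))
W2 = rng.normal(scale=0.5, size=(2, 4))

# One gradient descent step using the backpropagated gradients
gW1, gW2 = backprop(x, t, W1, W2)
lr = 0.5
before = loss(x, t, W1, W2)
after = loss(x, t, W1 - lr * gW1, W2 - lr * gW2)
```

Note that `backprop` only produces the gradient; the update rule at the end (plain gradient descent with learning rate `lr`) is the separate optimization step the bullet points distinguish it from.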


