DR.GEEK

Backpropagation

(1st-February-2021)


1. A set of examples for training the network is assembled. Each case consists of a problem statement (which represents the input to the network) and the corresponding solution (which represents the desired output from the network).

2. The input data is entered into the network via the input layer.

3. Each neuron in the network processes the input data, with the resulting values steadily "percolating" through the network, layer by layer, until a result is generated by the output layer.

4. The actual output of the network is compared to the expected output for that particular input, yielding an error value. The connection weights in the network are then gradually adjusted, working backwards from the output layer, through the hidden layer, to the input layer, until the correct output is produced. Fine-tuning the weights in this way has the effect of teaching the network how to produce the correct output for a particular input, i.e. the network learns.
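The four steps above can be sketched in a few lines of numpy. This is a minimal illustration, not a reference implementation: the 2-4-1 layer sizes, sigmoid activations, learning rate, and training data (a logical OR function) are all assumptions chosen for the example, since the post does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: assemble training cases (problem statements X, desired outputs y).
# The OR truth table here is an illustrative dataset, not from the post.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    # Steps 2-3: enter the inputs and let values percolate forward,
    # layer by layer, until the output layer produces a result.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Step 4: compare actual vs. expected output to get an error value,
    # then adjust the weights backwards, output layer first.
    delta_out = (y - out) * out * (1 - out)      # output-layer error signal
    delta_h = (delta_out @ W2.T) * h * (1 - h)   # propagated to hidden layer
    W2 += lr * h.T @ delta_out
    b2 += lr * delta_out.sum(axis=0)
    W1 += lr * X.T @ delta_h
    b1 += lr * delta_h.sum(axis=0)

print(float(np.abs(y - out).mean()))  # training error shrinks toward zero
```

Repeating the forward pass and the backward weight adjustment is exactly the "fine tuning" described above: with each pass the error value drops, and the network's outputs converge on the desired ones.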
