BACK-PROPAGATION
[bˈakpɹˌɒpɐɡˈe͡ɪʃən]
(Or "backpropagation") A learning algorithm for modifying a feed-forward neural network which minimises a continuous "error function" or "objective function." Back-propagation is a "gradient descent" method of training in that it uses gradient information to modify the network weights to decrease the value of the error function on subsequent tests of the inputs. Other gradient-based methods from numerical analysis can be used to train networks more efficiently.

Back-propagation makes use of a mathematical trick when the network is simulated on a digital computer, yielding in just two traversals of the network (once forward, and once back) both the difference between the desired and actual output, and the derivatives of this difference with respect to the connection weights.
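The two-traversal trick described above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: it assumes a 2-2-1 network of sigmoid units trained on XOR with a squared-error function, and all names (train_xor, w_h, w_o, and the hyperparameter values) are invented for the example.

```python
import math
import random

def train_xor(steps=5000, lr=0.5, seed=1):
    """Train a tiny 2-2-1 sigmoid network on XOR by back-propagation.

    Each training example needs only two traversals of the network:
    one forward pass for the actual output, one backward pass for the
    derivatives of the error with respect to every connection weight.
    """
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    # Hidden layer: 2 neurons, each with 2 input weights + a bias.
    w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    # Output neuron: 2 hidden weights + a bias.
    w_o = [rng.uniform(-1, 1) for _ in range(3)]
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def forward(x1, x2):
        h = [sig(w[0] * x1 + w[1] * x2 + w[2]) for w in w_h]
        y = sig(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
        return h, y

    def total_error():
        # Squared-error "objective function" over the training set.
        return sum((forward(x1, x2)[1] - t) ** 2 for (x1, x2), t in data)

    before = total_error()
    for _ in range(steps):
        for (x1, x2), target in data:
            h, y = forward(x1, x2)                     # forward traversal
            # Backward traversal: error derivative at the output...
            d_y = (y - target) * y * (1 - y)
            # ...propagated back through the output weights to each
            # hidden neuron.
            d_h = [d_y * w_o[i] * h[i] * (1 - h[i]) for i in range(2)]
            # Gradient-descent update on every weight.
            for i in range(2):
                w_o[i] -= lr * d_y * h[i]
            w_o[2] -= lr * d_y
            for i in range(2):
                w_h[i][0] -= lr * d_h[i] * x1
                w_h[i][1] -= lr * d_h[i] * x2
                w_h[i][2] -= lr * d_h[i]
    return before, total_error()
```

Because the backward pass reuses the activations saved from the forward pass, the cost of computing all the weight derivatives is comparable to the cost of a single forward evaluation, which is the practical point of the trick.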
By Denis Howe