Definitions
from The American Heritage® Dictionary of the English Language, 5th Edition.
- noun A common method of training a neural net in which the initial system output is compared to the desired output, and the system is adjusted until the difference between the two is minimized.
from Wiktionary, Creative Commons Attribution/Share-Alike License.
- noun computing An error correction technique used in neural networks.
- noun neurology A phenomenon in which the action potential of a neuron creates a voltage spike both at the end of the axon, as normal, and also back through to the dendrites from which much of the original input current originated.
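To make the computing sense concrete, here is a minimal sketch of the error-correction loop the first definition describes: the output is compared to the desired output and the weights are adjusted until the difference is minimized. The values and setup (a single sigmoid unit, squared error, learning rate 0.5) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0, 2.0])        # hypothetical input values
target = 0.8                          # desired output
w, b = rng.normal(size=3), 0.0        # initial weights and bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    y = sigmoid(x @ w + b)            # current system output
    error = y - target                # difference from the desired output
    grad = error * y * (1.0 - y)      # gradient of squared error through the sigmoid
    w -= 0.5 * grad * x               # adjust the weights ...
    b -= 0.5 * grad                   # ... until the difference is minimized
```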
Etymologies
from Wiktionary, Creative Commons Attribution/Share-Alike License
back- + propagation
Examples
- There are several algorithms used in neural networks, one of them is backpropagation, which is the one I used.
- Connectionist learning techniques such as backpropagation are far from explaining this kind of ‘one shot’ learning.
  Connectionism Garson, James 2007
- One of the most widely used of these training methods is called backpropagation.
  Connectionism Garson, James 2007
- Success with backpropagation and other connectionist learning methods may depend on quite subtle adjustment of the algorithm and the training set.
  Connectionism Garson, James 2007
- Furthermore, it is far from clear that the brain contains the kind of reverse connections that would be needed if the brain were to learn by a process like backpropagation, and the immense number of repetitions needed for such training methods seems far from realistic.
  Connectionism Garson, James 2007
- “We need diversity, symbiosis, backpropagation and emergence.”
  Food for thought 2004
- Good results are often observed empirically by applying backpropagation for nonlinear architectures, with the danger of local minima understood.
- DBN in an unsupervised manner and fine tuned it with backpropagation.
  Planet Lisp 2010
- Also the part about neural networks is simplistic, to say the least: what about the learning model and backpropagation?
- Basically, what the backpropagation algorithm does is to propagate backwards the error obtained in the output layer while comparing the calculated value in the nodes to the real or desired value.
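The last example above summarizes the mechanism. A minimal two-layer sketch (assumed setup: a 2-4-1 sigmoid network trained on XOR with squared error; all names and values are hypothetical, not taken from any of the quoted sources) shows the error obtained at the output layer being propagated backwards to the hidden layer and used to update the weights.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs (XOR)
T = np.array([[0.], [1.], [1.], [0.]])                  # desired outputs

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)           # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)           # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # error obtained in the output layer: calculated value vs. desired value
    delta_out = (Y - T) * Y * (1 - Y)
    # propagate that error backwards through the weights to the hidden layer
    delta_hid = (delta_out @ W2.T) * H * (1 - H)

    # gradient-descent weight updates (learning rate 1.0)
    W2 -= H.T @ delta_out
    b2 -= delta_out.sum(axis=0)
    W1 -= X.T @ delta_hid
    b1 -= delta_hid.sum(axis=0)
```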