Wednesday, May 17, 2017

Training Neural Networks with Backpropagation. Original Publication.

Neural networks have been a very important area of scientific study, one shaped by contributions from different disciplines such as mathematics, biology, psychology, and computer science.
The study of neural networks leapt from theory to practice with the emergence of computers.
Training a neural network by adjusting the weights of its connections is computationally very expensive, so its application to practical problems had to wait until the mid-1980s, when a more efficient algorithm was discovered.

That algorithm is now known as the backpropagation of errors, or simply backpropagation.
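To make the idea concrete, here is a minimal sketch of backpropagation in R: a small feed-forward network with one hidden layer learning the XOR function. The network size, learning rate, number of iterations, and sigmoid activation are illustrative choices of mine, not taken from the article.

```r
# Minimal backpropagation sketch: a 2-3-1 network learning XOR.
# All hyperparameters here are illustrative, not from the 1986 paper.
sigmoid <- function(x) 1 / (1 + exp(-x))

set.seed(1)
X <- matrix(c(0, 0,
              0, 1,
              1, 0,
              1, 1), ncol = 2, byrow = TRUE)
y <- matrix(c(0, 1, 1, 0), ncol = 1)

# Weights with bias terms folded in as an extra input row
W1 <- matrix(runif(3 * 3, -1, 1), nrow = 3)  # 2 inputs + bias -> 3 hidden
W2 <- matrix(runif(4 * 1, -1, 1), nrow = 4)  # 3 hidden + bias -> 1 output
lr <- 0.5

for (i in 1:20000) {
  # Forward pass
  A1  <- cbind(X, 1)           # append bias column
  H   <- sigmoid(A1 %*% W1)    # hidden activations
  A2  <- cbind(H, 1)
  out <- sigmoid(A2 %*% W2)    # network output

  # Backward pass: propagate the output error toward the input
  d_out    <- (y - out) * out * (1 - out)          # sigmoid derivative
  d_hidden <- (d_out %*% t(W2[1:3, , drop = FALSE])) * H * (1 - H)

  # Gradient-descent weight updates
  W2 <- W2 + lr * t(A2) %*% d_out
  W1 <- W1 + lr * t(A1) %*% d_hidden
}

round(out, 3)  # outputs should approach 0, 1, 1, 0
```

Each iteration runs a forward pass, then applies the chain rule in matrix form to push the output error backward through the weights; that backward flow of the error signal is what gives the algorithm its name.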

One of the most cited articles on this algorithm is:

Learning representations by back-propagating errors
David E. Rumelhart, Geoffrey E. Hinton & Ronald J. Williams
Nature 323, 533–536 (9 October 1986)

Although it is a very technical article, anyone who wants to study and understand neural networks should work through this material.

I share the entire article at:
https://github.com/pakinja/Data-R-Value