"A gentle introduction to the principles behind neural networks, including backpropagation. Rated G for general audiences."
This is very well done. If you have a quantitative background, you can watch it at 1.5x or 2x speed, I think :-)
I'd like to see a bit more on the history of backpropagation, and on convexity: why is the error function convex, or nearly so?
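As a rough sketch of where convexity does hold (this is the standard textbook case, not something from the video): for a single linear unit trained with squared error, the Hessian of the error is a sum of outer products, hence positive semidefinite, so the error surface is exactly convex in the weights.

$$
E(w) = \frac{1}{2}\sum_{i=1}^{n} \left(w^\top x_i - y_i\right)^2,
\qquad
\nabla^2 E(w) = \sum_{i=1}^{n} x_i x_i^\top \succeq 0 .
$$

Once a hidden layer and a nonlinearity are added, say $E(W_1, w_2) = \frac{1}{2}\sum_i \left(w_2^\top \sigma(W_1 x_i) - y_i\right)^2$, this guarantee disappears and the loss is in general non-convex, which is presumably why "nearly so" is the best one can say in practice.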