Neural Networks and Deep Learning
Using neural nets to recognize handwritten digits
How the backpropagation algorithm works
- Warm up: a fast matrix-based approach to computing the output from a neural network
- The two assumptions we need about the cost function
- The Hadamard product, $s \odot t$
- The four fundamental equations behind backpropagation
- Proof of the four fundamental equations (optional)
- The backpropagation algorithm
- The code for backpropagation
- In what sense is backpropagation a fast algorithm?
- Backpropagation: the big picture
Improving the way neural networks learn
A visual proof that neural nets can compute any function
Why are deep neural networks hard to train?
Appendix: Is there a simple algorithm for intelligence?
If you benefit from the book, please make a small donation. I suggest $5, but you can choose the amount.
Alternatively, you can make a donation by sending me Bitcoin, at address 1Kd6tXH5SDAmiFb49J9hknG5pqj7KStSAx
Sponsors
Thanks to all the supporters who made the book possible, with especial thanks to Pavel Dudrenov. Thanks also to all the contributors to the Bugfinder Hall of Fame.
Resources
Michael Nielsen's project announcement mailing list
Deep Learning, book by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
By Michael Nielsen / Dec 2019
Neural Networks and Deep Learning is a free online book. The book will teach you about:
- Neural networks, a beautiful biologically inspired programming paradigm that enables a computer to learn from observational data
- Deep learning, a powerful set of techniques for learning in neural networks
For more details about the approach taken in the book, see here. Or you can jump directly to Chapter 1 and get started.
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License. This means you're free to copy, share, and build on this book, but not to sell it. If you're interested in commercial use, please contact me.
Last update: Thu Dec 26 15:26:33 2019