Took a one-day break yesterday and re-read Deep Work.
Implemented backpropagation from scratch today using only NumPy. For the first time in a while it felt like I was using my intellectual powers to the fullest. I had to go through multiple videos and articles before it finally clicked. I also learned that forward-mode gradients can compute the same derivatives, but they are significantly more expensive than computing gradients "going backwards": forward mode needs one pass per input parameter, while reverse mode (backprop) needs one pass per scalar output, which is why backprop wins when a network has millions of parameters but a single scalar loss.
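A minimal sketch of the kind of thing I built, assuming a toy two-layer ReLU network fit to a made-up linear regression target (the sizes, learning rate, and target here are all illustrative, not the exact network I wrote):

```python
import numpy as np

# Tiny 2-layer network trained on a toy regression task,
# with gradients computed by hand via the chain rule.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                    # 64 samples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]   # linear target to fit

W1 = rng.normal(size=(3, 8)) * 0.1
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1)) * 0.1
b2 = np.zeros((1, 1))
lr = 0.1

for step in range(500):
    # forward pass
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0)                      # ReLU
    pred = a1 @ W2 + b2
    loss = np.mean((pred - y) ** 2)
    if step == 0:
        first_loss = loss

    # backward pass: walk the chain rule from the loss back to each weight
    dpred = 2 * (pred - y) / len(X)             # dL/dpred
    dW2 = a1.T @ dpred
    db2 = dpred.sum(axis=0, keepdims=True)
    da1 = dpred @ W2.T
    dz1 = da1 * (z1 > 0)                        # ReLU derivative is 0/1
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {first_loss:.4f} -> {loss:.4f}")
```

The key realization for me was that each layer only needs the gradient flowing in from the layer after it plus its own cached forward values, which is exactly what the computational-graph view of backprop makes explicit.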
Some of the things that helped:
- VikParuchuri's Neural Network from scratch
- Backpropagation - Colah's blog
- Michigan's DL for Vision - Backpropagation
- Gilbert Strang's Backpropagation: Find partial derivatives
- NumPy Neural Networks computational graphs - KDnuggets
- 3Blue1Brown - Backpropagation calculus
Found an interesting resource on the path Vikas Paruchuri took to learn deep learning: https://www.vikas.sh/post/how-i-got-into-deep-learning