I think I now understand RNNs and the different RNN variants (excluding LSTM and GRU). I tried implementing backward propagation for an RNN in NumPy, and it got really tricky. Since I have started time-boxing my work, I will be sticking with PyTorch from now on. I also did some planning for what I want to tackle over the next few days.
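To record where the NumPy attempt got tricky, here is a minimal sketch of backprop-through-time for a vanilla tanh RNN. The sizes, initialization, and the loss (sum of all hidden states) are my own illustrative assumptions, not from any of the resources below; the point is the backward loop, where the gradient from each timestep chains into all earlier ones through `Whh`.

```python
import numpy as np

rng = np.random.default_rng(0)
I, H, T = 3, 4, 5                      # input size, hidden size, timesteps (arbitrary)

Wxh = rng.standard_normal((H, I)) * 0.1
Whh = rng.standard_normal((H, H)) * 0.1
xs = rng.standard_normal((T, I))

# forward: h_t = tanh(Wxh x_t + Whh h_{t-1}), starting from h_0 = 0
hs = [np.zeros(H)]
for t in range(T):
    hs.append(np.tanh(Wxh @ xs[t] + Whh @ hs[-1]))

# toy loss: sum of every hidden unit at every timestep
# backward through time: walk the sequence in reverse
dWxh, dWhh = np.zeros_like(Wxh), np.zeros_like(Whh)
dh_next = np.zeros(H)
for t in reversed(range(T)):
    dh = np.ones(H) + dh_next          # direct dL/dh_t plus gradient arriving from t+1
    dz = dh * (1 - hs[t + 1] ** 2)     # back through tanh
    dWxh += np.outer(dz, xs[t])        # accumulate across timesteps
    dWhh += np.outer(dz, hs[t])
    dh_next = Whh.T @ dz               # the chained product that makes BPTT tricky
```

In PyTorch, `nn.RNN` plus a single `loss.backward()` call replaces this whole hand-written reverse loop, which is why I am switching.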
Here are some of the things that helped me:
- Recurrent Neural Networks - Stanford
- Recurrent Neural Networks, Transformers, and Attention - MIT Intro to Deep Learning
- PyTorch RNN Tutorial - Patrick Loeber
- CS231n Winter 2016: Lecture 10: Recurrent Neural Networks, Image Captioning, LSTM - Andrej Karpathy
- Finding Structure in Time
- Recurrent Neural Network
- RNN Effectiveness - Andrej Karpathy