
RNN, GRU, and LSTM Experiments

In this project, I worked through Andrej Karpathy's blog post "The Unreasonable Effectiveness of Recurrent Neural Networks" and, out of my own interest, also dug into Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM) networks. I implemented these models in PyTorch and additionally performed backpropagation from scratch in NumPy. A valuable lesson from the NumPy work was the importance of setting the initial weights correctly.
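To make that lesson concrete, here is a minimal NumPy sketch (the sizes and scales are illustrative, not the exact values used in this repo). A recurrent weight matrix drawn at too large a scale pushes the tanh units into their flat, saturated region within a few time steps, where gradients vanish during backpropagation through time; a small-scale initialization in the style of Karpathy's min-char-rnn (roughly `* 0.01`) keeps activations in a trainable range:

```python
import numpy as np

np.random.seed(0)
hidden_size = 100  # illustrative size, not necessarily the repo's config

def rollout(scale, steps=50):
    """Iterate h -> tanh(W_hh h + x) and measure how saturated the units get."""
    W_hh = np.random.randn(hidden_size, hidden_size) * scale
    h = np.zeros(hidden_size)
    for _ in range(steps):
        x = np.random.randn(hidden_size) * 0.1  # stand-in for an embedded input
        h = np.tanh(W_hh @ h + x)
    # Fraction of units stuck in tanh's flat region, where gradients vanish.
    return np.mean(np.abs(h) > 0.99)

print("saturated units, scale 1.0 :", rollout(1.0))   # most units saturate
print("saturated units, scale 0.01:", rollout(0.01))  # activations stay small
```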

Implementations

  1. Recurrent Neural Networks (RNNs)

    • Studied the theoretical concepts from Karpathy's blog post.
    • Implemented RNNs using PyTorch (see the sketch after this list).
    • Performed backpropagation from scratch using NumPy (a BPTT sketch follows the Conclusion).
  2. Gated Recurrent Units (GRUs)

    • Explored the architecture and benefits of GRUs.
    • Implemented GRUs using PyTorch.
    • Conducted backpropagation from scratch using NumPy.
  3. Long Short-Term Memory (LSTM) Networks

    • Investigated the structure and advantages of LSTMs.
    • Implemented LSTMs using PyTorch.
    • Executed backpropagation from scratch using NumPy.
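One convenient property on the PyTorch side is that `nn.RNN`, `nn.GRU`, and `nn.LSTM` share a constructor and calling convention, so the three implementations can reuse the same scaffolding. The sketch below illustrates this with a character-level model; the hyperparameters, class name, and training data are illustrative placeholders, not this repo's exact code:

```python
import torch
import torch.nn as nn

# Illustrative sizes; the repo's actual hyperparameters may differ.
vocab_size, embed_dim, hidden_size, seq_len, batch = 65, 32, 128, 50, 16

class CharModel(nn.Module):
    """Character-level sequence model; `cell` selects the recurrent core."""
    def __init__(self, cell="lstm"):
        super().__init__()
        rnn_cls = {"rnn": nn.RNN, "gru": nn.GRU, "lstm": nn.LSTM}[cell]
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = rnn_cls(embed_dim, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        # For LSTM the state is a (h, c) tuple; RNN/GRU carry only h.
        out, state = self.rnn(self.embed(x), state)
        return self.head(out), state

# The same loss computation works for all three cores.
x = torch.randint(0, vocab_size, (batch, seq_len))
y = torch.randint(0, vocab_size, (batch, seq_len))
for cell in ("rnn", "gru", "lstm"):
    model = CharModel(cell)
    logits, _ = model(x)
    loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size), y.reshape(-1))
    loss.backward()
    print(cell, float(loss))
```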

Conclusion

Through this project, I gained a deeper understanding of recurrent neural network architectures and their implementations. The hands-on work with PyTorch and NumPy reinforced the theory and provided valuable insight into how RNNs, GRUs, and LSTMs actually compute and learn; the from-scratch backward pass in particular is summarized in the sketch below.
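As a rough summary of what backpropagation through time looks like in NumPy, here is a minimal vanilla-RNN sketch in the style of Karpathy's min-char-rnn (the sizes, data, and variable names are illustrative, not this repo's exact code). The forward pass caches the hidden states; the backward pass walks the sequence in reverse, accumulating gradients and carrying `dh_next` across time steps:

```python
import numpy as np

np.random.seed(1)
V, H, T = 10, 16, 5  # illustrative vocab size, hidden size, sequence length

# Small random initialization (the lesson noted above).
Wxh = np.random.randn(H, V) * 0.01
Whh = np.random.randn(H, H) * 0.01
Why = np.random.randn(V, H) * 0.01
bh, by = np.zeros(H), np.zeros(V)

inputs = np.random.randint(0, V, T)   # toy token ids
targets = np.random.randint(0, V, T)

# Forward pass, caching activations for backprop through time.
xs, hs, ps = {}, {-1: np.zeros(H)}, {}
loss = 0.0
for t in range(T):
    xs[t] = np.zeros(V); xs[t][inputs[t]] = 1.0          # one-hot input
    hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1] + bh)  # hidden state
    y = Why @ hs[t] + by
    ps[t] = np.exp(y - y.max()); ps[t] /= ps[t].sum()    # softmax
    loss += -np.log(ps[t][targets[t]])                   # cross-entropy

# Backward pass: accumulate gradients over the time steps in reverse.
dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
dbh, dby = np.zeros_like(bh), np.zeros_like(by)
dh_next = np.zeros(H)
for t in reversed(range(T)):
    dy = ps[t].copy(); dy[targets[t]] -= 1               # d loss / d logits
    dWhy += np.outer(dy, hs[t]); dby += dy
    dh = Why.T @ dy + dh_next                            # gradient into h_t
    draw = (1 - hs[t] ** 2) * dh                         # through tanh
    dWxh += np.outer(draw, xs[t]); dWhh += np.outer(draw, hs[t - 1]); dbh += draw
    dh_next = Whh.T @ draw                               # carry to step t-1

print("loss:", loss, "| grad norm Whh:", np.linalg.norm(dWhh))
```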

