Assignments of the "Natural Language Processing with Deep Learning" course (CS 224n, Stanford University)
This repository contains my solutions to the assignments of the
"Natural Language Processing with Deep Learning" course (CS224n), taught by Christopher Manning
at Stanford University. I used the material from Winter 2021.
Instructor: Sauleh Etemadi
- Assignment 1 - Exploring word vectors
- Assignment 2 - Word2Vec implementation
- Assignment 3 - Dependency Parsing
- Assignment 4 - Neural Machine Translation
- Assignment 5 - Self-Attention, Transformers, and Pretraining
Assignment 1 explores word vectors such as GloVe embeddings and similarities between words.
Written Answer
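As a rough illustration of the kind of word-vector exploration in this assignment, here is a minimal sketch. It assumes the `gensim` package and its pretrained `glove-wiki-gigaword-100` vectors, which are not part of the assignment hand-out.

```python
# Minimal sketch: inspecting GloVe vectors and word similarities.
# Assumes gensim; "glove-wiki-gigaword-100" is a pretrained model name
# from gensim's downloader, not from the assignment itself.
import gensim.downloader as api

glove = api.load("glove-wiki-gigaword-100")   # 100-dimensional GloVe vectors

# Cosine similarity between two words.
print(glove.similarity("king", "queen"))

# Nearest neighbors of a word in the embedding space.
print(glove.most_similar("university", topn=5))

# The classic analogy: king - man + woman ≈ queen.
print(glove.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```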
Assignment 2 is an implementation of the Word2Vec algorithm trained with stochastic gradient descent (SGD).
Written Answer
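For intuition, here is a minimal NumPy sketch of one skip-gram / naive-softmax SGD step. Variable and function names are illustrative, not the exact signatures used in the assignment code.

```python
# One skip-gram / naive-softmax loss computation and SGD update (sketch).
import numpy as np

def naive_softmax_loss_and_grads(center_vec, outside_idx, outside_vecs):
    """Loss and gradients for one (center word, outside word) pair."""
    scores = outside_vecs @ center_vec                  # (V,)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                                # softmax over the vocabulary
    loss = -np.log(probs[outside_idx])

    delta = probs.copy()
    delta[outside_idx] -= 1.0                           # y_hat - y
    grad_center = outside_vecs.T @ delta                # (d,)
    grad_outside = np.outer(delta, center_vec)          # (V, d)
    return loss, grad_center, grad_outside

# One SGD update with learning rate lr (toy sizes).
rng = np.random.default_rng(0)
V, d, lr = 10, 5, 0.3
center_vecs = rng.normal(size=(V, d)) * 0.1
outside_vecs = rng.normal(size=(V, d)) * 0.1

loss, g_c, g_o = naive_softmax_loss_and_grads(center_vecs[3], 7, outside_vecs)
center_vecs[3] -= lr * g_c
outside_vecs  -= lr * g_o
```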
Assignment 3 is about dependency parsing: implementing a neural transition-based dependency parser.
Written Answer
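The sketch below shows the transition system behind such a shift-reduce parser (SHIFT / LEFT-ARC / RIGHT-ARC). Class and transition names here are illustrative, in the spirit of the assignment's partial-parse structure rather than its exact API.

```python
# Minimal shift-reduce dependency parsing transitions (sketch).
class PartialParse:
    def __init__(self, sentence):
        self.stack = ["ROOT"]
        self.buffer = list(sentence)     # words still to be processed
        self.dependencies = []           # (head, dependent) pairs

    def parse_step(self, transition):
        if transition == "S":            # SHIFT: move next word onto the stack
            self.stack.append(self.buffer.pop(0))
        elif transition == "LA":         # LEFT-ARC: second stack item depends on top
            dependent = self.stack.pop(-2)
            self.dependencies.append((self.stack[-1], dependent))
        elif transition == "RA":         # RIGHT-ARC: top stack item depends on second
            dependent = self.stack.pop(-1)
            self.dependencies.append((self.stack[-1], dependent))

# Example: parse "I ate fish" with a fixed transition sequence.
pp = PartialParse(["I", "ate", "fish"])
for t in ["S", "S", "LA", "S", "RA", "RA"]:
    pp.parse_step(t)
print(pp.dependencies)   # [('ate', 'I'), ('ate', 'fish'), ('ROOT', 'ate')]
```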
Assignment 4 is about training a neural machine translation (NMT) system with recurrent neural networks.
You will probably need a GPU for training; I used Google Colab.
Written Answer
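For orientation, here is a bare-bones PyTorch sketch of an RNN encoder-decoder. It omits attention, beam search, and the assignment's actual architecture and hyperparameters, which are my own placeholders.

```python
# Minimal RNN encoder-decoder for NMT (sketch, no attention).
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, embed_dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the source sentence; keep the final hidden/cell state.
        _, (h, c) = self.encoder(self.src_embed(src_ids))
        # Decode conditioned on the encoder state (teacher forcing).
        dec_out, _ = self.decoder(self.tgt_embed(tgt_ids), (h, c))
        return self.out(dec_out)          # (batch, tgt_len, tgt_vocab)

# Tiny smoke test on random token ids.
model = TinySeq2Seq(src_vocab=100, tgt_vocab=120)
src = torch.randint(0, 100, (2, 7))
tgt = torch.randint(0, 120, (2, 5))
print(model(src, tgt).shape)              # torch.Size([2, 5, 120])
```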
Assignment 5 is about self-attention, Transformers, and pretraining.
Written Answer
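The core operation behind the Transformer blocks in this assignment is scaled dot-product self-attention; here is a minimal PyTorch sketch (a single head, with made-up projection matrices rather than the assignment's modules).

```python
# Minimal scaled dot-product self-attention (single head, sketch).
import math
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """x: (batch, seq_len, d_model); w_*: (d_model, d_k) projection matrices."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = F.softmax(scores, dim=-1)   # attention weights over the sequence
    return weights @ v

d_model, d_k = 16, 8
x = torch.randn(2, 5, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_k) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)   # torch.Size([2, 5, 8])
```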