- Slides
- CS231n lecture on RNNs - https://www.youtube.com/watch?v=iX5V1WpxxkY
- Our lecture, seminar
- [alternative] Brief lecture on RNN by nervana - https://www.youtube.com/watch?v=Ukgii7Yd_cU
- [alternative] More detailed lecture by Y. Bengio - https://www.youtube.com/watch?v=xK-bzjIQkmM
- Great reading by Karpathy - http://karpathy.github.io/2015/05/21/rnn-effectiveness/
- LSTM explained in detail by colah - http://colah.github.io/posts/2015-08-Understanding-LSTMs/
- Seq2seq lecture - https://www.youtube.com/watch?v=G5RY_SUJih4
- "Awesome rnn" entry point - https://github.com/kjw0612/awesome-rnn
- OpenAI research on sentiment analysis that sheds some light on what happens inside an LSTM language model.
As you might have guessed, there are two parts:
- Follow the first notebook and implement a simple character-level RNN in pure Lasagne. The homework part (4 points) is at the very end of that notebook.
- Proceed to the seq2seq notebook for the second part of the homework assignment (6 points).
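Before opening the notebooks, it may help to see the core recurrence in isolation. Below is a minimal, hedged sketch of a character-level RNN sampling loop in plain numpy with untrained random weights; all names (`Wxh`, `hidden` size, the toy alphabet) are illustrative and not taken from the notebook, which implements the same idea with Lasagne layers.

```python
import numpy as np

# Toy character-level RNN: h_t = tanh(Wxh x_t + Whh h_{t-1}),
# logits = Why h_t. Weights are random (untrained) -- the point is
# only to show the recurrence and the autoregressive sampling loop.
rng = np.random.default_rng(0)
vocab = list("abc ")                      # toy alphabet
V, H = len(vocab), 16                     # vocab size, hidden size

Wxh = rng.normal(0, 0.1, (H, V))          # input-to-hidden
Whh = rng.normal(0, 0.1, (H, H))          # hidden-to-hidden
Why = rng.normal(0, 0.1, (V, H))          # hidden-to-output

def step(h, x_idx):
    """One RNN step: update hidden state, return next-char distribution."""
    x = np.zeros(V)
    x[x_idx] = 1.0
    h = np.tanh(Wxh @ x + Whh @ h)
    logits = Why @ h
    p = np.exp(logits - logits.max())
    return h, p / p.sum()

def sample(n_chars, seed_idx=0):
    """Autoregressive sampling: each sampled char becomes the next input."""
    h, idx, out = np.zeros(H), seed_idx, []
    for _ in range(n_chars):
        h, p = step(h, idx)
        idx = rng.choice(V, p=p)
        out.append(vocab[idx])
    return "".join(out)

print(sample(10))  # gibberish until trained, but the loop is the same
```

Training then consists of running `step` over a real text corpus and backpropagating the cross-entropy loss through time; the notebook walks you through that part.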
In this assignment, you will need to implement two things (the points are the same for both):
- A generative RNN model for one of the datasets below, or for your own custom dataset (anything from clickbait headlines to pokemon names)
- A conditional generative model, either for the [formula]->[common_name] task on the molecules dataset below, or for image captioning (or a similar custom dataset)
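The second task differs from the first only in that generation is conditioned on an input sequence. One common way to do this (a hedged numpy sketch, not the notebook's Lasagne code; all names and the toy alphabets are illustrative) is to encode the condition, e.g. a formula, into a vector and feed that vector to the decoder at every step:

```python
import numpy as np

# Conditional generation sketch: an encoder RNN reads the condition
# (e.g. a molecule formula) into a single vector; the decoder RNN
# receives that vector at every step while sampling the target
# (e.g. a common name). Weights are random/untrained.
rng = np.random.default_rng(1)
src_vocab = list("CHON0123456789")        # toy "formula" alphabet
tgt_vocab = list("abcdefghij ")           # toy "name" alphabet
Vs, Vt, H = len(src_vocab), len(tgt_vocab), 24

enc_Wx = rng.normal(0, 0.1, (H, Vs))
enc_Wh = rng.normal(0, 0.1, (H, H))
dec_Wx = rng.normal(0, 0.1, (H, Vt))
dec_Wh = rng.normal(0, 0.1, (H, H))
dec_Wc = rng.normal(0, 0.1, (H, H))       # injects the condition each step
dec_Wy = rng.normal(0, 0.1, (Vt, H))

def onehot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def encode(formula):
    """Run the encoder over the condition string; return its last state."""
    h = np.zeros(H)
    for ch in formula:
        h = np.tanh(enc_Wx @ onehot(src_vocab.index(ch), Vs) + enc_Wh @ h)
    return h

def decode(cond, n_chars):
    """Sample the target, feeding the condition vector at every step."""
    h, idx, out = np.zeros(H), 0, []
    for _ in range(n_chars):
        h = np.tanh(dec_Wx @ onehot(idx, Vt) + dec_Wh @ h + dec_Wc @ cond)
        logits = dec_Wy @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()
        idx = rng.choice(Vt, p=p)
        out.append(tgt_vocab[idx])
    return "".join(out)

print(decode(encode("C2H5OH"), 8))
```

For image captioning the structure is the same, except the condition vector comes from a convolutional network instead of an encoder RNN.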
Some helper materials:
- CS231n RNN assignment
- "Deep models for text and sequences" section of this course