🍃 This repo contains my paper reading notes on NLP and some toy project code, kaggle writeups etc.
Title | Field | Year | Report link | Started | Status |
---|---|---|---|---|---|
[NeurIPS 2017] Attention Is All You Need | NLP | 2017 | https://bcli.me/blog/transformer | 2021/12/11 | Done |
[NAACL 2019] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | NLP | 2018 | https://bcli.me/blog/bert | 2021/12/15 | Done |
[NeurIPS 2014] Sequence to Sequence Learning with Neural Networks | NLP | 2014 | https://bcli.me/blog/seq2seq | 2022/1/21 | Done |
[ICLR 2018] Non-Autoregressive Neural Machine Translation | NLP | 2018 | https://bcli.me/blog/nonauto | 2022/1/24 | 60% |
[ICLR 2019] Parameter-Efficient Transfer Learning for NLP | NLP | 2019 | https://bcli.me/blog/petl | - | Pending |
[ICLR 2018] Unsupervised Neural Machine Translation | NLP | 2018 | https://bcli.me/blog/unsupervised-NMT | - | Pending |
More to be added.
Unless otherwise noted, each blog post is written assuming roughly zero background. For the more specific concepts in a paper, I link to the original sources I found easiest to understand, so they are convenient to look up and I don't re-explain them in the post.

Why track progress here: updating it regularly is proof that I haven't quietly abandoned the project...