CS224n: Natural Language Processing with Deep Learning

This course provides a thorough introduction to cutting-edge research in deep learning applied to NLP: word vector representations, window-based neural networks, recurrent neural networks, long short-term memory (LSTM) models, recursive neural networks, and convolutional neural networks, as well as recent models that incorporate a memory component. Through lectures and programming assignments, students learn the engineering tricks needed to make neural networks work on practical problems.

Syllabus

  1. Introduction to NLP and Deep Learning.
  2. Word Vector Representations: word2vec.
  3. Advanced Word Vector Representations.
  4. Word Window Classification and Neural Networks.
  5. Backpropagation and Project Advice.
  6. Dependency Parsing.
  7. Introduction to TensorFlow.
  8. Recurrent Neural Networks and Language Models.
  9. Machine Translation and Advanced Recurrent LSTMs and GRUs.
  10. Neural Machine Translation and Models with Attention.
  11. Gated Recurrent Units and Further Topics in NMT.
  12. End-to-End Models for Speech Processing.
  13. Convolutional Neural Networks.
  14. Tree Recursive Neural Networks and Constituency Parsing.
  15. Coreference Resolution.
  16. Dynamic Neural Networks for Question Answering.
  17. Issues in NLP and Possible Architectures for NLP.
  18. Tackling the Limits of Deep Learning for NLP.
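To give a flavor of the earliest topics above, here is a minimal sketch of word2vec-style skip-gram training with negative sampling in NumPy. The toy corpus, hyperparameters, and function names are illustrative assumptions, not taken from the course assignments.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # center-word ("input") embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word ("output") embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(center, context, k=3, lr=0.05):
    """One SGD step on a (center, context) pair with k negative samples."""
    c = word2id[center]
    # First target is the true context word (label 1); the rest are
    # uniformly sampled negatives (label 0).
    targets = [word2id[context]] + list(rng.integers(0, V, size=k))
    labels = np.array([1.0] + [0.0] * k)
    v_c = W_in[c]                       # (D,)
    u = W_out[targets]                  # (k+1, D)
    scores = sigmoid(u @ v_c)           # predicted P(pair is real)
    grad = (scores - labels)[:, None]   # gradient of logistic loss w.r.t. scores
    W_in[c] -= lr * (grad * u).sum(axis=0)
    W_out[targets] -= lr * grad * v_c

# Slide a context window of radius 2 over the corpus for a few epochs.
for _ in range(200):
    for i, center in enumerate(corpus):
        for j in range(max(0, i - 2), min(len(corpus), i + 3)):
            if i != j:
                train_pair(center, corpus[j])
```

After training, rows of `W_in` serve as the word vectors; words appearing in similar contexts end up with similar embeddings. The full course assignments add refinements such as frequency-weighted negative sampling and subsampling of frequent words.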