This course provides a thorough introduction to cutting-edge research in deep learning applied to NLP: word vector representations, window-based neural networks, recurrent neural networks, long short-term memory (LSTM) models, recursive neural networks, and convolutional neural networks, as well as some recent models involving a memory component. Through lectures and programming assignments, students will learn the engineering tricks needed to make neural networks work on practical problems.
- Introduction to NLP and Deep Learning.
- Word Vector Representations: word2vec.
- Advanced Word Vector Representations.
- Word Window Classification and Neural Networks.
- Backpropagation and Project Advice.
- Dependency Parsing.
- Introduction to TensorFlow.
- Recurrent Neural Networks and Language Models.
- Machine Translation and Advanced Recurrent LSTMs and GRUs.
- Neural Machine Translation and Models with Attention.
- Gated Recurrent Units and Further Topics in NMT.
- End-to-End Models for Speech Processing.
- Convolutional Neural Networks.
- Tree Recursive Neural Networks and Constituency Parsing.
- Coreference Resolution.
- Dynamic Neural Networks for Question Answering.
- Issues in NLP and Possible Architectures for NLP.
- Tackling the Limits of Deep Learning for NLP.
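As a small taste of the word2vec material above, here is a minimal sketch of skip-gram training with negative sampling using only NumPy. The toy corpus, hyperparameters, and variable names are illustrative assumptions, not course code; a real implementation would use subsampling, a unigram noise distribution, and a much larger corpus.

```python
# Toy skip-gram with negative sampling (illustrative sketch, numpy only).
import numpy as np

rng = np.random.default_rng(0)
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8                 # vocabulary size, embedding dimension

W_in = rng.normal(0, 0.1, (V, D))    # "input" (center-word) vectors
W_out = rng.normal(0, 0.1, (V, D))   # "output" (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3           # learning rate, context size, negatives
for epoch in range(50):
    for i, center in enumerate(corpus):
        c = idx[center]
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j == i:
                continue
            o = idx[corpus[j]]
            # Positive pair: increase the dot product of center and context.
            g = sigmoid(W_in[c] @ W_out[o]) - 1.0
            grad_c = g * W_out[o]
            W_out[o] = W_out[o] - lr * g * W_in[c]
            # Negative samples: decrease dot products with random words.
            for n in rng.integers(0, V, size=k):
                gn = sigmoid(W_in[c] @ W_out[n])
                grad_c = grad_c + gn * W_out[n]
                W_out[n] = W_out[n] - lr * gn * W_in[c]
            W_in[c] = W_in[c] - lr * grad_c

print(W_in[idx["fox"]][:3])          # first 3 dims of the learned vector
```

After training, `W_in` holds one dense vector per vocabulary word; nearby dot products reflect co-occurrence in the toy corpus.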