After studying this course, students will:
- Understand the definition of a range of neural network models;
- Be able to derive and implement optimisation algorithms for these models;
- Understand neural implementations of attention mechanisms and sequence embedding models, and how these modular components can be combined to build state-of-the-art NLP systems;
- Have an awareness of the hardware issues inherent in implementing scalable neural network models for language data;
- Be able to implement and evaluate common neural network models for language.
This course will make use of a range of basic concepts from Probability, Linear Algebra, and Continuous Mathematics. Students should have a good knowledge of basic Machine Learning, either from an introductory course or practical experience. No prior linguistic knowledge will be assumed. The course will contain a significant practical component, and it will be assumed that participants are proficient programmers.
Synopsis
- Introduction/Conclusion: Why neural networks for language and how this course fits into the wider fields of Natural Language Processing, Computational Linguistics, and Machine Learning.
- Simple Recurrent Neural Networks: model definition; the backpropagation through time optimisation algorithm; small scale language modelling and text embedding.
- Advanced Recurrent Neural Networks: Long Short Term Memory and Gated Recurrent Units; large scale language modelling, open vocabulary language modelling and morphology.
- Scale: minibatching and GPU implementation issues.
- Speech Recognition: Neural Networks for acoustic modelling and end-to-end speech models.
- Sequence to Sequence Models: Generating from an embedding; attention mechanisms; Machine Translation; Image Caption generation.
- Question Answering: QA tasks and paradigms; neural attention mechanisms and Memory Networks for QA.
- Advanced Memory: Neural Turing Machines, stacks, and other structures.
- Linguistic models: syntactic and semantic parsing with recurrent networks.
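To give a flavour of the "Simple Recurrent Neural Networks" topic above, the following is a minimal sketch of an Elman-style recurrent cell unrolled over a toy sequence. All names, dimensions, and weight initialisations here are illustrative assumptions, not taken from the course materials.

```python
import numpy as np

# Illustrative sketch (not course code): a simple recurrent network cell,
# h_t = tanh(W_xh x_t + W_hh h_{t-1} + b), unrolled over a short sequence.
rng = np.random.default_rng(0)

input_dim, hidden_dim = 4, 3
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # recurrent hidden-to-hidden weights
b_h = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    """One recurrence step: combine the current input with the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Unroll over a toy sequence of 5 input vectors, starting from a zero state.
sequence = rng.normal(size=(5, input_dim))
h = np.zeros(hidden_dim)
states = []
for x_t in sequence:
    h = rnn_step(x_t, h)
    states.append(h)

print(len(states), states[-1].shape)  # prints: 5 (3,)
```

Backpropagation through time, covered in the same lecture, is obtained by differentiating a loss through this unrolled chain of `rnn_step` applications.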
Keywords
Recurrent Neural Networks, Backpropagation Through Time, Long Short Term Memory, Attention Networks, Memory Networks, Neural Turing Machines, Machine Translation, Question Answering, Speech Recognition, Syntactic and Semantic Parsing, GPU optimisation for Neural Networks
Reading list
As the material covered in this course is based on recent research results, there is no suitable textbook for the area. The readings for the course will instead be drawn from published papers and online material.