Lecture 11: Gated Recurrent Units and Further Topics in NMT

Lecture 11 provides a final look at gated recurrent units such as GRUs and LSTMs, followed by machine translation evaluation, dealing with large vocabulary output, and sub-word and character-based models. The lecture also includes the research highlight "Lip Reading Sentences in the Wild."
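
For viewers who want a concrete picture of the gating idea the lecture covers, below is a minimal NumPy sketch of a single GRU step. It is not taken from the lecture slides; the weight names and the omission of bias terms are illustrative assumptions, and the sign convention on the update gate varies across papers.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: gates decide how much of the previous hidden
    state to keep versus overwrite with new candidate content.
    (Bias terms omitted for brevity.)"""
    z = sigmoid(Wz @ x + Uz @ h_prev)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate state
    # Some papers swap z and (1 - z); the blending idea is the same.
    return (1.0 - z) * h_prev + z * h_tilde

# Toy usage with random weights (hidden size 4, input size 3).
rng = np.random.default_rng(0)
d_h, d_x = 4, 3
params = [rng.normal(size=(d_h, d_x)) if i % 2 == 0 else rng.normal(size=(d_h, d_h))
          for i in range(6)]  # Wz, Uz, Wr, Ur, Wh, Uh
h = np.zeros(d_h)
for x in rng.normal(size=(5, d_x)):  # run over a short input sequence
    h = gru_cell(x, h, *params)
print(h)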

Key phrases: Seq2Seq and Attention Mechanisms, Neural Machine Translation, Speech Processing

-------------------------------------------------------------------------------

Natural Language Processing with Deep Learning

Instructors:
Chris Manning
Richard Socher

Natural language processing (NLP) deals with the key artificial intelligence technology of understanding complex human language communication. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.

For additional learning opportunities please visit:
http://stanfordonline.stanford.edu/
