MIT 6.S191: Recurrent Neural Networks, Transformers, and Attention

MIT Introduction to Deep Learning 6.S191: Lecture 2
Recurrent Neural Networks
Lecturer: Ava Amini
** New 2024 Edition **

For all lectures, slides, and lab materials: http://introtodeeplearning.com

Lecture Outline
0:00 - Introduction
3:42 - Sequence modeling
5:30 - Neurons with recurrence
12:20 - Recurrent neural networks
14:08 - RNN intuition
17:14 - Unfolding RNNs
19:54 - RNNs from scratch (see the first sketch below the outline)
22:41 - Design criteria for sequential modeling
24:24 - Word prediction example
31:50 - Backpropagation through time
33:40 - Gradient issues
37:15 - Long short-term memory (LSTM)
40:00 - RNN applications
44:00 - Attention fundamentals
46:46 - Intuition of attention
49:13 - Attention and search relationship
51:22 - Learning attention with neural networks (see the second sketch below the outline)
57:45 - Scaling attention and applications
1:00:08 - Summary
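
For readers skimming the outline, here is a minimal sketch of the kind of vanilla RNN update built in the "RNNs from scratch" segment (19:54). This is an illustration only, not the lecture's actual code: the class and weight names (SimpleRNNCell, W_xh, W_hh, W_hy) are placeholders, and the tanh state update is the standard formulation the segment describes.

    # A minimal sketch of a vanilla RNN cell; names are illustrative placeholders.
    import numpy as np

    class SimpleRNNCell:
        def __init__(self, input_dim, hidden_dim, output_dim, seed=0):
            rng = np.random.default_rng(seed)
            # Weight matrices: input-to-hidden, hidden-to-hidden, hidden-to-output
            self.W_xh = rng.normal(0.0, 0.1, (hidden_dim, input_dim))
            self.W_hh = rng.normal(0.0, 0.1, (hidden_dim, hidden_dim))
            self.W_hy = rng.normal(0.0, 0.1, (output_dim, hidden_dim))
            self.h = np.zeros(hidden_dim)  # hidden state carried across time steps

        def step(self, x):
            # Core recurrence: h_t = tanh(W_hh h_{t-1} + W_xh x_t)
            self.h = np.tanh(self.W_hh @ self.h + self.W_xh @ x)
            # Per-step output: y_t = W_hy h_t
            return self.W_hy @ self.h

    # Process a toy sequence of 5 random input vectors
    cell = SimpleRNNCell(input_dim=3, hidden_dim=8, output_dim=2)
    for x_t in np.random.default_rng(1).normal(size=(5, 3)):
        y_t = cell.step(x_t)

Because the same W_hh is reused at every step, gradients flow through repeated multiplications during backpropagation through time, which is the source of the vanishing/exploding gradient issues covered at 33:40.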
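Similarly, the attention portion of the lecture (51:22) builds up to a query-key-value formulation. The following is a minimal NumPy sketch of scaled dot-product attention, with toy dimensions chosen purely for illustration; it is not the lecture's code.

    # A minimal sketch of scaled dot-product attention; dimensions are toy values.
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v)."""
        d = Q.shape[-1]
        # Similarity scores between queries and keys, scaled by sqrt(d)
        scores = Q @ K.T / np.sqrt(d)
        # Softmax over keys turns scores into attention weights
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Output: attention-weighted combination of the values
        return weights @ V

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 16))   # 4 queries
    K = rng.normal(size=(10, 16))  # 10 keys
    V = rng.normal(size=(10, 32))  # 10 values
    out = scaled_dot_product_attention(Q, K, V)  # shape (4, 32)
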
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!
