[Technion ECE046211 Deep Learning W24] Tutorial 07 - Seq. - Part 2 - Attention and Transformers

Hands-on Tutorial in Python and PyTorch
Technion ECE 046211 Deep Learning Winter 24
Tutorial 07: Sequential Tasks - Part 2 - Attention and Transformers

Jupyter Notebook: https://github.com/taldatech/ee046211...

Open in Colab: https://colab.research.google.com/git...

Topics:
Multi-head Self-Attention, Transformer, BERT and GPT, Teacher Forcing, torchtext, Sentiment Analysis
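
As a preview of the multi-head self-attention material, here is a minimal from-scratch sketch in PyTorch. The sizes (embed_dim=64, num_heads=4) and class name are illustrative assumptions; the tutorial notebook's implementation may be structured differently.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim, num_heads):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear layer produces queries, keys and values for all heads at once
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim)
        B, T, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split embed_dim into (num_heads, head_dim) and move heads before tokens
        q = q.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(B, T, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        attn = F.softmax(scores, dim=-1)
        out = attn @ v                               # (B, num_heads, T, head_dim)
        out = out.transpose(1, 2).reshape(B, T, D)   # concatenate the heads
        return self.out(out)

# Usage: a batch of 2 sequences, 10 tokens each, 64-dim embeddings
mhsa = MultiHeadSelfAttention(embed_dim=64, num_heads=4)
y = mhsa(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])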

Chapters:
0:00 Introduction
0:33 The Attention Mechanism
9:30 Attention: Query, Key and Value, Multi-head Attention
21:47 Implementing Multi-head Self-Attention with PyTorch
28:48 The Transformer
32:54 Transformer Encoder: Positional Encoding, AddNorm and MLPs
45:00 Transformer Classifier for Sentiment Analysis on IMDB with PyTorch
46:25 Transformer Decoder: Cross-Attention, Sequence-to-Sequence
50:09 Pre-trained Models: BERT and GPT
52:37 Vision Transformers (ViTs)
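
To accompany the Positional Encoding part of the Transformer Encoder chapter (32:54), here is a minimal sketch of the sinusoidal encoding from the original Transformer paper; max_len and d_model below are illustrative assumptions, not values taken from the notebook.

import torch

def sinusoidal_positional_encoding(max_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = torch.arange(max_len).unsqueeze(1).float()   # (max_len, 1)
    i = torch.arange(0, d_model, 2).float()            # (d_model / 2,)
    angle = pos / torch.pow(10000.0, i / d_model)      # (max_len, d_model / 2)
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(angle)  # even dimensions get the sine
    pe[:, 1::2] = torch.cos(angle)  # odd dimensions get the cosine
    return pe

pe = sinusoidal_positional_encoding(max_len=100, d_model=64)
print(pe.shape)  # torch.Size([100, 64])

In practice this encoding is added to the token embeddings before the first encoder layer, so that otherwise permutation-invariant self-attention can use token positions.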

Course GitHub: https://github.com/taldatech/ee046211...
Student Projects Website: https://taldatech.github.io/ee046211-...
