Relative Position Bias (+ PyTorch Implementation)

In this video, I explain why position embeddings are required in vision transformers, what the limitations of absolute position embeddings are, and how relative position bias improves on them.

Table of Contents:
00:00 Permutation Equivariance
01:12 Absolute Position Embedding
02:42 Limitation of absolute positions
03:56 Relative Position Bias intuition
07:57 Relative Position Bias in theory
12:53 PyTorch Implementation
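The core idea covered in the video can be sketched in PyTorch. The snippet below is a minimal, Swin-style sketch (not necessarily the exact code shown at 12:53): a learnable bias table indexed by the relative offset between every pair of positions in a 2-D window, so that pairs with the same offset always share the same bias. The class name `RelativePositionBias` and the window/head sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn


class RelativePositionBias(nn.Module):
    """Learnable relative position bias for windowed self-attention.

    A sketch in the style popularized by Swin Transformer: one learnable
    scalar per head for each possible relative offset inside the window.
    """

    def __init__(self, window_size, num_heads):
        super().__init__()
        self.window_size = window_size  # (Wh, Ww)
        self.num_heads = num_heads
        Wh, Ww = window_size
        # Row offsets span 2*Wh - 1 values, column offsets 2*Ww - 1.
        self.bias_table = nn.Parameter(
            torch.zeros((2 * Wh - 1) * (2 * Ww - 1), num_heads))

        # Precompute, for every pair of positions (i, j) in the window,
        # the index of their relative offset into bias_table.
        coords = torch.stack(torch.meshgrid(
            torch.arange(Wh), torch.arange(Ww), indexing="ij"))   # 2, Wh, Ww
        coords_flat = coords.flatten(1)                            # 2, N
        rel = coords_flat[:, :, None] - coords_flat[:, None, :]    # 2, N, N
        rel = rel.permute(1, 2, 0).contiguous()                    # N, N, 2
        rel[:, :, 0] += Wh - 1          # shift row offsets to start at 0
        rel[:, :, 1] += Ww - 1          # shift column offsets to start at 0
        rel[:, :, 0] *= 2 * Ww - 1      # row-major flattening of (row, col)
        self.register_buffer("relative_index", rel.sum(-1))        # N, N

    def forward(self):
        # Returns the bias to add to attention logits: (num_heads, N, N).
        N = self.window_size[0] * self.window_size[1]
        bias = self.bias_table[self.relative_index.view(-1)]       # N*N, heads
        return bias.view(N, N, self.num_heads).permute(2, 0, 1).contiguous()
```

Because the table is indexed by relative offset rather than absolute position, any two query-key pairs with the same spatial displacement receive the same bias, which is exactly the translation property absolute position embeddings lack.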

Icon made by Freepik from flaticon.com
