How to explain Q, K and V of Self Attention in Transformers (BERT)?

I thought about this and present here my most general approach: the history behind the Query, Key, and Value notation, and how the three combine in the classical self-attention mechanism of transformers.
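
In symbols, the classical scaled dot-product attention of Vaswani et al. (2017) is Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, where Q, K, and V are linear projections of the same input sequence. Below is a minimal single-head NumPy sketch of that step; the function name, matrix names, and sizes are illustrative assumptions, not taken from the video.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    x:             (seq_len, d_model) input token embeddings
    w_q, w_k, w_v: (d_model, d_k) learned projection matrices (here random)
    """
    q = x @ w_q                      # queries: what each token is looking for
    k = x @ w_k                      # keys: what each token offers for matching
    v = x @ w_v                      # values: the content that gets mixed
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # similarity of every query to every key
    # Softmax over the key axis turns scores into attention weights per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v               # each output is a weighted sum of values

# Example with made-up sizes: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 4): one d_k-dimensional output per token
```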

#ai
#self_attention
#bert
