Pablo Barceló: Attention is Turing Complete

Talk given by Pablo Barceló to the Formal Languages and Neural Networks Discord on 26 September 2022. Thank you, Pablo!

Papers and resources mentioned during the talk and following discussion:
Main Paper:
Attention is Turing Complete (Pérez et al., 2021): https://jmlr.org/papers/v22/20-302.html

References:

From the Discussion:
Are Transformers universal approximators of sequence-to-sequence functions? (Yun et al., 2019): https://arxiv.org/abs/1912.10077
