Vectors of Cognitive AI: Attention

Panelists: Michael Graziano, Jonathan Cohen, Vasudev Lal, Joscha Bach

The seminal contribution "Attention Is All You Need" (Vaswani et al., 2017), which introduced the Transformer architecture, triggered a small revolution in machine learning. Unlike convolutional neural networks, which construct each feature from a fixed neighborhood of signals, Transformers learn which data a feature on the next layer of the network should attend to. However, attention in neural networks is very different from the integrated attention of a human mind. In our minds, attention seems to be part of a top-down mechanism that actively creates a coherent, dynamic model of reality and plays a crucial role in planning, inference, reflection and creative problem solving. Our consciousness appears to be involved in maintaining the control model of our attention.
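To make the contrast concrete, below is a minimal NumPy sketch of the scaled dot-product attention described in Vaswani et al. (2017); the function and variable names are chosen here for illustration and are not taken from the panel materials. The point is that the mixing weights are computed from the data itself (via learned query/key projections), rather than being fixed by spatial locality as in a convolution.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Mix value vectors using data-dependent weights.

    Q, K, V: arrays of shape (seq_len, d_model). Each output position is a
    weighted average over all value vectors; the weights come from query/key
    similarity, not from a fixed neighborhood.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise relevance of every position to every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V  # each output attends to all inputs

# Toy usage (hypothetical dimensions): 4 tokens, 8-dimensional features.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(x @ W_q, x @ W_k, x @ W_v)
print(out.shape)  # (4, 8)
```

In a full Transformer these projections are learned parameters and the operation is repeated across multiple heads and layers; the sketch omits masking, multi-head splitting, and the feed-forward blocks.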

In this panel, we want to discuss avenues toward a better understanding of attention in the context of machine learning, cognitive science, and the future development of AI.

Full program and references: https://cognitive-ai-panel.webflow.io...
