Week3: Paper: Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention

May 8 (Wed), 19:00–20:00 (GMT+6:30)
Presenter - Thura Aung

Thura Aung is currently studying for a B.Eng. in Software Engineering at the Faculty of Computer Engineering, School of Engineering, King Mongkut's Institute of Technology Ladkrabang (KMITL) in Bangkok, Thailand. Under the supervision of Prof. Ye Kyaw Thu, he is a member of the Language Understanding Lab, Myanmar, which is dedicated to Myanmar language processing R&D. Thura contributes to building high-quality corpora and lightweight language processing systems. He has also volunteered as a Machine Learning Engineer at Omdena since November 2021, contributing to seasonal Computer Vision and NLP projects. He has published two research papers, and work to extend these publications is ongoing. His research interests include Artificial Intelligence (AI), Natural Language Processing (NLP), and Software Engineering.

#MLPaperReadingClubs #mlpaperreadingclubs