AI Seminar Series 2024: Learning Continually by Spectral Regularization, Alex Lewandowski

The AI Seminar is a weekly meeting at the University of Alberta where researchers interested in artificial intelligence (AI) can share their research. Presenters include both local speakers from the University of Alberta and visitors from other institutions. Topics can be related in any way to artificial intelligence, from foundational theoretical work to innovative applications of AI techniques to new fields and problems.

Abstract:
Neural networks can become less trainable over the course of learning, a phenomenon referred to as loss of plasticity. This talk will describe a spectral perspective on neural network trainability. At initialization, the distribution of singular values for the neural network parameters is relatively uniform. Over the course of learning, the maximum singular value (spectral norm) grows with the number of updates performed by the learning algorithm. We propose spectral regularization, which regularizes the spectral norm, to maintain the spectral properties present at initialization. Our experiments across a wide variety of datasets, architectures and non-stationarities demonstrate that spectral regularization is both effective and insensitive to hyperparameters.

Presenter Bio:
Alex Lewandowski is a PhD student supervised by Dale Schuurmans and Marlos C. Machado at the University of Alberta. His research focuses on understanding scalable continual learning, with the goal of developing algorithms capable of learning autonomously at every scale.
