Implementing GELU and Its Derivative from Scratch

In this video, we discuss and implement the GELU activation function and its derivative using PyTorch.
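For reference, the exact GELU is x · Φ(x), where Φ is the standard normal CDF, and its derivative is Φ(x) + x · φ(x), with φ the standard normal PDF. A minimal sketch of a custom `torch.autograd.Function` with explicit `forward` and `backward` methods, verified via `gradcheck` (this is an illustration of the approach, not the codebase's exact implementation):

```python
import math

import torch


class GELU(torch.autograd.Function):
    """Exact GELU: f(x) = x * Phi(x), Phi the standard normal CDF."""

    @staticmethod
    def forward(ctx, x):
        # Save the input for use in the backward pass
        ctx.save_for_backward(x)
        cdf = 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))
        return x * cdf

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # d/dx [x * Phi(x)] = Phi(x) + x * phi(x)
        cdf = 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))
        pdf = torch.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
        return grad_output * (cdf + x * pdf)


# gradcheck compares the analytic backward against finite differences;
# it requires double precision inputs with requires_grad=True
x = torch.randn(8, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradcheck(GELU.apply, (x,))
```

`gradcheck` numerically differentiates `forward` and compares the result against the hand-derived `backward`, which is a convenient way to catch sign or constant errors in the derivative.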

Codebase: https://github.com/oniani/ai
GitHub: https://github.com/oniani
Web: https://oniani.org

#ai #softwareengineering #programming #stylepoint #gelu

Chapters
0:00 - Intro
0:39 - Discussing GELU
9:24 - Computing the derivative of GELU
11:19 - Implementing `forward` method
12:33 - Implementing `backward` method
13:42 - Using `gradcheck` for testing
14:12 - The alternative implementation
15:20 - Outro
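The chapter list mentions an alternative implementation; a common alternative to the exact erf-based GELU is the tanh approximation from the original GELU paper. A short sketch comparing the two (the constant 0.044715 comes from that approximation; whether this matches the video's alternative is an assumption):

```python
import math

import torch


def gelu_exact(x):
    # Exact GELU: x * Phi(x), Phi the standard normal CDF
    return 0.5 * x * (1.0 + torch.erf(x / math.sqrt(2.0)))


def gelu_tanh(x):
    # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)
    return 0.5 * x * (1.0 + torch.tanh(inner))


# The two agree closely over typical activation ranges
x = torch.linspace(-4.0, 4.0, steps=101)
max_diff = torch.max(torch.abs(gelu_exact(x) - gelu_tanh(x)))
print(max_diff)
```

The tanh form avoids `erf` and was historically faster on some backends, which is why many frameworks expose both variants.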
