Understanding derivatives of activation functions in detail


In this video, you will understand how the derivatives of activation functions affect neural network learning, and why we need the activations at each layer to stay within a small range while training our neural networks.

Sigmoid derivative will be between 0 and 0.25 (it peaks at 0.25 when the input is 0, and approaches 0 for large positive or negative inputs)
Tanh derivative will be between 0 and 1 (it peaks at 1 when the input is 0, and approaches 0 for large positive or negative inputs)
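These bounds can be checked numerically. A minimal sketch (independent of the linked notebook) that evaluates both derivatives and confirms where they peak:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    # sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)); maximum 0.25 at z = 0
    s = sigmoid(z)
    return s * (1.0 - s)

def tanh_derivative(z):
    # tanh'(z) = 1 - tanh(z)^2; maximum 1 at z = 0
    return 1.0 - np.tanh(z) ** 2

z = np.linspace(-10.0, 10.0, 1001)
print(sigmoid_derivative(z).max())  # peak value, attained at z = 0
print(tanh_derivative(z).max())     # peak value, attained at z = 0
```

Because both derivatives shrink toward 0 as |z| grows, stacking many layers multiplies these small values together during backpropagation, which is why keeping activations in a small range around 0 matters for learning.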

Link to notebook to plot derivatives of activation functions: https://github.com/ShankarPendse/Deep...
