Vanishing Gradient Problem || Quickly Explained


The Vanishing Gradient Problem occurs when gradient information fades away as it is backpropagated through a deep neural network.
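The shrinking effect can be sketched in a few lines of NumPy (this is an illustrative toy, not the code from the video): backpropagation multiplies one activation-derivative factor per layer, and the sigmoid's derivative is at most 0.25, so the product shrinks geometrically with depth.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

# Chain rule: the gradient reaching layer k picks up one
# sigmoid-derivative factor per layer above it. Even at the
# steepest point (x = 0), each factor is only 0.25.
grad = 1.0
per_layer = []
for layer in range(20):
    grad *= sigmoid_prime(0.0)
    per_layer.append(grad)

print(per_layer[0])   # 0.25 after one layer
print(per_layer[-1])  # ~9.1e-13 after twenty layers: the gradient has vanished
```

After just twenty layers the surviving gradient is on the order of 10^-13, which is why the early layers of a deep sigmoid network barely learn.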

This is why deep neural networks were rarely used before 2006, when researchers found ways to mitigate it.

In this video, we'll see how it occurs during training with the help of a little mathematics, and also prove it experimentally with Keras and TensorFlow.

If you have any questions, please comment down below. If you don't have any, kindly leave your feedback; it means a lot to me.

Thanks and regards.

For more mathematical explanation: https://mattmazur.com/2015/03/17/a-st...

I'm available for your queries, ask me at:
Facebook:   / developershutt
Instagram: https://www.instagram.com/developershutt
