Batch Normalization - Part 3: Backpropagation & Inference

We have been discussing Batch Normalization in detail. We have seen why we need Batch Normalization, dug deeper into how it works, and understood the significance of the learnable parameters Gamma and Beta, which perform scaling and shifting. In this video, we will derive backpropagation through the Batch Normalization layer and see how Batch Normalization works during inference, when mini-batch statistics are not available.
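
For reference, here is a minimal NumPy sketch of the forward pass and the backward pass whose derivatives the video walks through (w.r.t. Beta, Gamma, X_norm, Variance, Mean, and Xi). The function and variable names (batchnorm_forward, x_norm, dvar, and so on) are my own choices for illustration, not code from the video:

import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # x has shape (N, D): N examples in the mini-batch, D features
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_norm = (x - mu) / np.sqrt(var + eps)
    out = gamma * x_norm + beta  # scale and shift
    cache = (x, x_norm, mu, var, gamma, eps)
    return out, cache

def batchnorm_backward(dout, cache):
    x, x_norm, mu, var, gamma, eps = cache
    N = x.shape[0]
    inv_std = 1.0 / np.sqrt(var + eps)
    # gradients w.r.t. the learnable parameters Beta and Gamma
    dbeta = dout.sum(axis=0)
    dgamma = (dout * x_norm).sum(axis=0)
    # chain rule back through the normalization step
    dx_norm = dout * gamma
    dvar = np.sum(dx_norm * (x - mu), axis=0) * -0.5 * inv_std**3
    dmu = -np.sum(dx_norm, axis=0) * inv_std + dvar * np.mean(-2.0 * (x - mu), axis=0)
    dx = dx_norm * inv_std + dvar * 2.0 * (x - mu) / N + dmu / N
    return dx, dgamma, dbeta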

Deep Learning Projects playlist:
   • Deep Learning Projects  

Neural Networks From Scratch in Python:
   • Neural Networks From Scratch in Python  

Chapters:
00:00 Introduction
00:24 Recap of how BN works
02:16 Computation graph of BN
03:41 Backpropagation of BN
05:06 Derivative calculation w.r.t. Beta
06:55 Derivative calculation w.r.t. Gamma
08:01 Derivative calculation w.r.t. X_norm
09:18 Derivative calculation w.r.t. Variance
12:38 Derivative calculation w.r.t. Mean
16:07 Derivative calculation w.r.t. Xi
19:38 All derivative equations
20:01 BN during inference/testing
22:36 Exponential moving average calculation of Mean & Variance
25:00 Summary
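
As a companion to the 20:01 and 22:36 chapters, here is a small sketch of how the running mean and variance can be tracked with an exponential moving average during training and then used at inference in place of mini-batch statistics. The momentum value of 0.9 and all names here are illustrative assumptions, not the exact code from the video:

def batchnorm_train_step(x, gamma, beta, running_mean, running_var,
                         momentum=0.9, eps=1e-5):
    # training: normalize with the current mini-batch statistics
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)
    out = gamma * (x - batch_mean) / np.sqrt(batch_var + eps) + beta
    # keep exponential moving averages for use at inference time
    running_mean = momentum * running_mean + (1 - momentum) * batch_mean
    running_var = momentum * running_var + (1 - momentum) * batch_var
    return out, running_mean, running_var

def batchnorm_inference(x, gamma, beta, running_mean, running_var, eps=1e-5):
    # inference: no mini-batch statistics, so use the running averages
    x_norm = (x - running_mean) / np.sqrt(running_var + eps)
    return gamma * x_norm + beta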

#batchnormalization #normalization #neuralnetworks #internalcovariateshift #covariance #normaldistribution #backpropagation #gradientdescent
